Final Project - Computer Vision 2 - CEIA

📝 Note:
GitHub's rendering of Jupyter notebooks is only a preview.
For better visualization we suggest the viewer recommended by the community: nbviewer.
You can access this notebook on that site through the following link: TrabajoFinalVPC2

Figure 1: Palm tree detection with deep learning models.
✨Project info:✨

Subtitle: Final project - VPC2 - FIUBA
Description: Palm tree detector
Team: - Bruno Masoller (brunomaso1@gmail.com) - Simón Rodriguez (simon.andre.r@gmail.com)

✋ Imports: ✋

First, several resources are downloaded from Google Drive. The download contains the folders datasets and utils: datasets holds the data sets, while utils holds auxiliary files that support visualization and model training.

In [ ]:
# Basic libraries for file-system handling, compression, and randomness.
import sys, os, warnings, zipfile, shutil, random, time, json, gc

# Preliminary configuration.
DOWNLOAD_RESOURCES = True # Whether or not to download the resources.
RUTA_CARPETA_RESOURCES = './resources'
RUTA_ARCHIVO_REQUERIMIENTOS = os.path.join(RUTA_CARPETA_RESOURCES, 'requirements.txt')
RUTA_CARPETA_UTILS = os.path.join(RUTA_CARPETA_RESOURCES, 'utils')

# This logic downloads the resources this project needs from Google Drive.
# If the link is unavailable, they can be obtained from:
# - Palm tree dataset: https://www.kaggle.com/datasets/riotulab/aerial-images-of-palm-trees
# - torchvision detection utilities: https://github.com/pytorch/vision/tree/main/references/detection
if DOWNLOAD_RESOURCES:
    print("Downloading resources...")
    %pip install gdown
    !gdown https://drive.google.com/drive/folders/1DFfgoGQ8-zLwQU85Uyq0X72GX4IITInV?usp=sharing --folder

# Check that the folder was downloaded correctly.
if os.path.exists(RUTA_CARPETA_RESOURCES):
    print(f"Folder {RUTA_CARPETA_RESOURCES} found.")

    # Check that the requirements file exists.
    if os.path.exists(RUTA_ARCHIVO_REQUERIMIENTOS):
        print("Requirements file found, installing dependencies...")
        %pip install -r {RUTA_ARCHIVO_REQUERIMIENTOS}
    else:
        warnings.warn(f"File {RUTA_ARCHIVO_REQUERIMIENTOS} not found. Requirements will not be installed.")

    if os.path.exists(RUTA_CARPETA_UTILS):
        # Add this path to the system path so the PyTorch detection utilities can be imported.
        sys.path.append(os.path.abspath(RUTA_CARPETA_UTILS))
        print(f"Added {RUTA_CARPETA_UTILS} to the system path.")
    else:
        raise FileNotFoundError(f"Folder {RUTA_CARPETA_UTILS} not found. Check the download.")
else:
    raise FileNotFoundError(f"Folder {RUTA_CARPETA_RESOURCES} not found. Check the download.")

# Library that reports GPU utilization.
from GPUtil import showUtilization as gpu_usage

# Typing helpers for type hints.
from typing import List, Tuple, Dict, Union, Callable
from enum import Enum

# Numerical processing.
import numpy as np

# Plotting.
import matplotlib.pyplot as plt
import matplotlib.patches as patches

# Data handling.
import pandas as pd

# PyTorch deep learning library and its computer vision counterpart.
import torch, torchvision
from torch import nn, optim

# PyTorch utilities.
from torch.utils.data import DataLoader
from torch.utils.data import Dataset

# torchvision utilities.
from torchvision.transforms import functional as F
from torchvision.transforms import v2 as T
from torchvision.ops import nms
from torchvision.io import read_image
from torchvision.utils import draw_bounding_boxes
from torchvision import tv_tensors

# Data augmentation.
import albumentations as A
from albumentations.pytorch import ToTensorV2

# Progress bars.
from tqdm.auto import tqdm
from tqdm import trange

# XML file processing.
import xml.etree.ElementTree as ET

# Computer vision utilities.
import cv2

# Image handling.
# from PIL import Image

# Dataset visualization.
import fiftyone as fo

# Summaries of deep learning models.
from torchinfo import summary

import ultralytics
from ultralytics import YOLO

# Utilities from torchvision references/detection.
from engine import train_one_epoch, evaluate # type: ignore
from utils import MetricLogger
from coco_eval import CocoEvaluator

ultralytics.checks()

🔧 Configuration: 🔧

In [3]:
VERBOSE = True # Print epoch-by-epoch progress.
LOAD_MODEL_FROM_CHECKPOINT = True # Allow loading the model from a checkpoint.
DEVICE = 'cuda' if torch.cuda.is_available() else 'cpu' # Select the device.
RANDOM_SEED = 42
EPOCHS = 100 # Default number of training epochs.
BATCH_SIZE = 64 # Default batch size.

# Datasets:
PALMERAS_KAGGLE_DATSET = 'Palm-Counting-349images'

# Formats:
PALMERAS_KAGGLE_DATASET_VOC = 'palmeras_kaggle_voc_format'
PALMERAS_KAGGLE_DATASET_COCO = 'palmeras_kaggle_coco_format'
PALMERAS_KAGGLE_DATASET_YOLO = 'palmeras_kaggle_yolo_format'

# Generic paths:
RUTA_CARPETA_COMPRESSED = os.path.join(RUTA_CARPETA_RESOURCES, 'datasets', 'compressed')
RUTA_CARPETA_RAW = os.path.join(RUTA_CARPETA_RESOURCES, 'datasets', 'raw')
RUTA_CARPETA_MODELS = os.path.join(RUTA_CARPETA_RESOURCES, 'models')
RUTA_CARPETA_KAGGLE_DATASET = os.path.join(RUTA_CARPETA_RAW, PALMERAS_KAGGLE_DATSET)

# VOC-format dataset paths:
RUTA_CARPETA_KAGGLE_DATASET_VOC = os.path.join(RUTA_CARPETA_RAW, PALMERAS_KAGGLE_DATASET_VOC)
RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC, "train")
RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC, "test")
RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_DATA = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN, "data")
RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_LABELS = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN, "labels")
RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST, "data")
RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST, "labels")

# YOLO-format dataset paths:
RUTA_CARPETA_KAGGLE_DATASET_YOLO = os.path.join(RUTA_CARPETA_RAW, PALMERAS_KAGGLE_DATASET_YOLO)
RUTA_CARPETA_KAGGLE_DATASET_YOLO_TRAIN = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_YOLO, 'train')
RUTA_CARPETA_KAGGLE_DATASET_YOLO_TEST = os.path.join(RUTA_CARPETA_KAGGLE_DATASET_YOLO, 'test')
RUTA_CARPETA_KAGGLE_DATSET_YOLO_TRAIN_CSV = os.path.join(RUTA_CARPETA_KAGGLE_DATASET, 'train_labels.csv')
RUTA_CARPETA_KAGGLE_DATSET_YOLO_TEST_CSV = os.path.join(RUTA_CARPETA_KAGGLE_DATASET, 'test_labels.csv')

# Image settings:
FORMATOS_IMAGENES = ('.jpg', '.jpeg', '.png')
FORMATOS_LABELS = ('.xml',)
IMG_SIZE = 300

# Labels:
LABELS = {
    'Palm' : 1
}

# Models:
FASTERRCNN_RESNET50_FPN_V2_MODEL_ID = 'fasterrcnn_resnet50_fpn_v2'
FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG = 'fasterrcnn_resnet50_fpn_v2_sin_aug'
YOLOV8_MODEL_ID = 'yolov8'
YOLOV11_MODEL_ID = 'yolov11'

YOLOV8_CONFIG_YAML = os.path.join(RUTA_CARPETA_UTILS, YOLOV8_MODEL_ID + '.yaml')

TRAIN_TEST_SPLIT_RATIO = [0.8, 0.2]

random.seed(RANDOM_SEED)

def clean_gpu_usage() -> None:
    """Report GPU utilization and clear the GPU cache."""
    print("Initial GPU Usage")
    gpu_usage()
    gc.collect()
    torch.cuda.empty_cache()
    print("GPU Usage after emptying the cache")
    gpu_usage()


print(f'Current device: {DEVICE}')
Current device: cpu

Introduction

As the final project for the course Computer Vision 2, within the specialization in Artificial Intelligence at the Facultad de Ingeniería of the Universidad de Buenos Aires, the goal is to develop a computer vision model that uses deep learning to detect the locations of palm trees in aerial images.

Methodologically, CRISP-ML(Q) principles are applied to tackle the problem.

Business and data understanding

Palm tree detection is widely studied in computer vision, given its potential as an input to other solutions. For example, solutions that aim to improve palm tree health benefit from an initial aerial detection step followed by manual inspection, which reduces costs.

Many studies (such as this one) are carried out in countries where most palm trees are raw material for palm oil production. In non-producing countries, however, caring for palm trees also matters because of their environmental impact.

The goal of this notebook is to present modeling approaches for detecting palm trees in different datasets.

Data preparation

The data was obtained from Kaggle: https://www.kaggle.com/datasets/riotulab/aerial-images-of-palm-trees. For easier handling, however, it is imported from Google Drive (https://drive.google.com/drive/folders/1DFfgoGQ8-zLwQU85Uyq0X72GX4IITInV).

Data pre-processing

After downloading the data, we have the following directory structure:

resources
├── datasets
│   └── compressed
│       └── palmeras_kaggle.zip
└── utils
    ├── engine.py
    ├── train.py
    └── ...
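Before moving on, we can sanity-check that the download produced this layout. A minimal sketch (the expected paths are taken from the tree above; the helper name is ours, not part of the project utilities):

```python
import os

def carpetas_faltantes(base: str = "./resources") -> list:
    """Return the expected paths (from the tree above) that are missing under `base`."""
    esperados = [
        os.path.join(base, "datasets", "compressed", "palmeras_kaggle.zip"),
        os.path.join(base, "utils", "engine.py"),
    ]
    # Keep only the paths that do not exist yet.
    return [p for p in esperados if not os.path.exists(p)]
```

An empty list means the structure matches; otherwise the returned paths point at what is missing.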

🔮 Auxiliary function:

Given a source folder and a destination folder, extracts every zip file in the source folder into the destination folder.

In [17]:
def descomprimir_archivos(carpeta_origen: str, carpeta_destino: str) -> None:
    """
    Given a source folder and a destination folder, extracts every zip file in the source folder into the destination folder.

    Args:
        carpeta_origen (str): Path of the folder containing the zip files.
        carpeta_destino (str): Path of the folder where the files will be extracted.

    Returns:
        None
    """
    archivos: List[str] = [f for f in os.listdir(carpeta_origen) if f.endswith(".zip")]
    if not archivos:
        warnings.warn("No .zip files found to extract.")
        return

    for archivo in archivos:
        ruta_archivo: str = os.path.join(carpeta_origen, archivo)
        print(f"Extracting {archivo} into {carpeta_destino}...")

        # Extract the archive.
        with zipfile.ZipFile(ruta_archivo, "r") as zip_ref:
            zip_ref.extractall(carpeta_destino)
        print(f"{archivo} extracted successfully.")

We extract the data:

In [18]:
# Create the 'raw' folder if it does not exist.
os.makedirs(RUTA_CARPETA_RAW, exist_ok=True)

# Check that the compressed folder exists.
if os.path.exists(RUTA_CARPETA_COMPRESSED):
    descomprimir_archivos(RUTA_CARPETA_COMPRESSED, RUTA_CARPETA_RAW)
else:
    warnings.warn(f"Folder {RUTA_CARPETA_COMPRESSED} not found. Were the resources downloaded correctly?")
Extracting palmeras_kaggle.zip into ./resources\datasets\raw...
palmeras_kaggle.zip extracted successfully.

Once the files are extracted, we can see that the dataset is in Pascal VOC format (it also includes two .csv files that can be used for other formats, e.g. YOLO).
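As a reference for such a conversion, YOLO stores each box as center coordinates and size normalized by the image dimensions, rather than absolute corners. A minimal sketch of the VOC-to-YOLO box conversion (the helper name is ours, not part of the dataset tooling):

```python
def voc_a_yolo(xmin: float, ymin: float, xmax: float, ymax: float,
               img_w: int, img_h: int) -> tuple:
    """Convert absolute VOC corners (xmin, ymin, xmax, ymax) to normalized YOLO (cx, cy, w, h)."""
    cx = (xmin + xmax) / 2.0 / img_w   # box center x, as a fraction of image width
    cy = (ymin + ymax) / 2.0 / img_h   # box center y, as a fraction of image height
    w = (xmax - xmin) / img_w          # box width, normalized
    h = (ymax - ymin) / img_h          # box height, normalized
    return cx, cy, w, h

print(voc_a_yolo(0, 0, 100, 50, 200, 100))  # (0.25, 0.25, 0.5, 0.5)
```

Applying this formula to each annotated box, together with the image's width and height, yields YOLO-style labels.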

We rearrange the files into a "data" folder and a "labels" folder, as follows:

palmeras_kaggle_formato_voc/
├── test/
│   ├── data/
│   │   ├── img1.jpg
│   │   ├── img2.jpg
│   │   └── ...
│   └── labels/
│       ├── img1.xml
│       ├── img2.xml
│       └── ...
└── train/
    └── ...

🔮 Auxiliary function:

Moves image and label files to their respective destination directories.

In [19]:
def mover_archivos(source_dir: str, data_dest: str, label_dest: str) -> None:
    """
    Moves image and label files to their respective destination directories.

    Args:
        source_dir (str): Source directory containing the files to move.
        data_dest (str): Destination directory for image files.
        label_dest (str): Destination directory for label files.

    Returns:
        None
    """
    for filename in os.listdir(source_dir):
        file_path: str = os.path.join(source_dir, filename)

        # Check whether it is an image file according to the configured formats.
        if filename.endswith(FORMATOS_IMAGENES):
            shutil.move(file_path, os.path.join(data_dest, filename))
        # Check whether it is a label file (usually .xml).
        elif filename.endswith(FORMATOS_LABELS):
            shutil.move(file_path, os.path.join(label_dest, filename))

🔮 Auxiliary function:

Reorganizes the extracted dataset into the Pascal VOC structure.

In [20]:
def parse_to_pascal_voc(decompressed_folder: str) -> None:
    """Reorganizes the extracted dataset into the Pascal VOC structure."""
    # Define the folder paths.
    base_dir = os.path.join(RUTA_CARPETA_RAW, decompressed_folder)
    train_dir = os.path.join(base_dir, "train")
    test_dir = os.path.join(base_dir, "test")

    if not os.path.exists(base_dir):
        warnings.warn(
            f"Folder {base_dir} does not exist. Was the archive extracted?"
        )
        return None

    if os.path.exists(RUTA_CARPETA_KAGGLE_DATASET_VOC):
        warnings.warn(
            f"Folder {RUTA_CARPETA_KAGGLE_DATASET_VOC} already exists; nothing was done."
        )
        return None

    # Create the new folders if they do not exist.
    os.makedirs(RUTA_CARPETA_KAGGLE_DATASET_VOC, exist_ok=True)
    os.makedirs(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA, exist_ok=True)
    os.makedirs(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS, exist_ok=True)
    os.makedirs(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_DATA, exist_ok=True)
    os.makedirs(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_LABELS, exist_ok=True)

    # Move the train and test files into the new folders.
    try:
        mover_archivos(
            train_dir,
            RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_DATA,
            RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_LABELS,
        )
        mover_archivos(
            test_dir,
            RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA,
            RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS,
        )
    except OSError:
        warnings.warn(
            f"Error while moving files. Do the folders {train_dir} and {test_dir} exist, or were they already removed?"
        )

To produce the required structure, we perform the following steps:

  1. Create the folders.
  2. Move the files.

In [21]:
parse_to_pascal_voc(PALMERAS_KAGGLE_DATSET)

🔮 Auxiliary function:

Counts the number of image files in a directory.

In [22]:
def contar_imagenes(directorio: str) -> int:
    """Counts the number of image files in a directory."""
    return len([f for f in os.listdir(directorio) if f.endswith(FORMATOS_IMAGENES)])

Once we have the correct structure, we can see how many examples there are:

In [23]:
print(f'Total test images: {contar_imagenes(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA)}')
print(f'Total train images: {contar_imagenes(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_DATA)}')
Total test images: 70
Total train images: 279

If we inspect some annotations, we see that they contain a polygon tag in the XML (polygons are usually used in segmentation problems). The path tag also holds a route that no longer applies. To process these files correctly, we remove both tags.

🔮 Auxiliary function:

Modifies an XML file, removing objects that carry polygon tags and the 'path' tag.

In [24]:
def process_tags(xml_path: str) -> Tuple[int, int]:
    """
    Modifies an XML file, removing objects that carry polygon tags and the 'path' tag.

    This function parses an XML file, removes every object that contains a
    polygon tag, and removes the path tag if present. Changes are saved to the same file.

    Args:
        xml_path (str): Path of the XML file to modify.

    Returns:
        Tuple[int, int]: Number of path tags removed and number of polygon tags removed.
    """
    # TODO: the folder tag still needs updating and the file name adjusting.
    tree: ET.ElementTree = ET.parse(xml_path)
    root: ET.Element = tree.getroot()
    tag_path_eliminados = 0
    tag_polygon_eliminados = 0

    # Remove the 'path' tag if present.
    path_elem = root.find("path")
    if path_elem is not None:
        root.remove(path_elem)
        tag_path_eliminados += 1

    # Find all objects.
    for obj in root.findall("object"):
        # If the object contains a polygon tag, remove the whole object.
        if obj.find("polygon") is not None:
            root.remove(obj)
            tag_polygon_eliminados += 1

    # Save the changes back to the file.
    tree.write(xml_path)
    return tag_path_eliminados, tag_polygon_eliminados

🔮 Auxiliary function:

Walks every XML file and applies process_tags.

In [25]:
def apply_process_tags(base_dir: str, formatos_labels: Tuple) -> Tuple[int, int]:
    """Walks every XML file in base_dir, applies process_tags, and accumulates the counts."""
    total_tag_path_eliminados = 0
    total_tag_polygon_eliminados = 0
    for xml_file in os.listdir(base_dir):
        if xml_file.endswith(formatos_labels):
            xml_file_path = os.path.join(base_dir, xml_file)
            # Apply the function to each XML file.
            tag_path_eliminados, tag_polygon_eliminados = process_tags(xml_file_path)
            total_tag_path_eliminados += tag_path_eliminados
            total_tag_polygon_eliminados += tag_polygon_eliminados
    return total_tag_path_eliminados, total_tag_polygon_eliminados

We walk every XML file and remove the tags:

In [26]:
total_tag_path_eliminados = 0
total_tag_polygon_eliminados = 0

tag_path_eliminados, tag_polygon_eliminados = apply_process_tags(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS, FORMATOS_LABELS)
total_tag_path_eliminados += tag_path_eliminados
total_tag_polygon_eliminados += tag_polygon_eliminados

tag_path_eliminados, tag_polygon_eliminados = apply_process_tags(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_LABELS, FORMATOS_LABELS)
total_tag_path_eliminados += tag_path_eliminados
total_tag_polygon_eliminados += tag_polygon_eliminados

print(f"Polygon tags removed: {total_tag_polygon_eliminados}")
print(f"Path tags removed: {total_tag_path_eliminados}")
Polygon tags removed: 3
Path tags removed: 349

Data visualization

🔮 Auxiliary functions:

Helper functions for image visualization.

In [30]:
def paint_bounding_boxes(
    ax: plt.Axes, boxes: np.ndarray, labels: list, config: Dict
) -> None:
    """
    Dibuja bounding boxes en el eje especificado.

    Args:
        ax (plt.Axes): Eje de Matplotlib donde se dibujarán las cajas.
        boxes (np.ndarray): Array de coordenadas de las cajas.
        labels (list): Lista de etiquetas correspondientes a cada caja.
        config (Dict): Configuración para el estilo de las cajas y texto.

    Returns:
        None
    """
    for box, label in zip(boxes, labels):
        xmin, ymin, xmax, ymax = box
        rect = patches.Rectangle(
            (xmin, ymin),
            xmax - xmin,
            ymax - ymin,
            linewidth=config["linewidth"],
            edgecolor=config["edgecolor"],
            facecolor="none",
        )
        ax.add_patch(rect)

        if config.get("putText", False):
            ax.text(
                xmin,
                ymin - 10,
                label,
                color=config["color"],
                fontsize=config["fontsize"],
                backgroundcolor=config.get("backgroundcolor", "white"),
            )


def draw_bbox(
    img: Union[str, torch.Tensor],
    target: Union[str, Dict[str, torch.Tensor]],
    ax: plt.Axes,
    config: Dict,
) -> None:
    """
    Dibuja bounding boxes en una imagen desde un archivo o un dataset.

    Args:
        img (Union[str, torch.Tensor]): Ruta de la imagen o tensor de la imagen.
        target_or_annotation_path (Union[str, Dict[str, torch.Tensor]]): Ruta al archivo XML de anotación o diccionario con anotaciones.
        ax (plt.Axes): Eje de Matplotlib donde se dibujará la imagen.
        config (Dict): Configuración para el estilo de las cajas y texto.

    Returns:
        None
    """

    # Validación básica
    if not isinstance(ax, plt.Axes):
        raise ValueError("ax debe ser un objeto plt.Axes")
    if isinstance(img, str):
        image = read_image(img)
    elif isinstance(img, torch.Tensor):
        image = img
    else:
        raise ValueError("img debe ser una cadena o un tensor")

    image_tensor = image.permute(1, 2, 0)

    ax.imshow(image_tensor)
    ax.axis("off")

    if config.get("from_dataset_voc", False):
        boxes = target["boxes"]
        labels = target["labels"]
        paint_bounding_boxes(ax, boxes, labels, config)

    else:
        tree = ET.parse(target)
        root = tree.getroot()

        boxes = []
        labels = []

        for obj in root.findall("object"):
            bbox = obj.find("bndbox")
            xmin = int(bbox.find("xmin").text)
            ymin = int(bbox.find("ymin").text)
            xmax = int(bbox.find("xmax").text)
            ymax = int(bbox.find("ymax").text)

            boxes.append((xmin, ymin, xmax, ymax))
            labels.append(obj.find("name").text)

        paint_bounding_boxes(ax, np.array(boxes), labels, config)


def draw_grid(
    images: List[Union[str, torch.Tensor]],
    targets: List[Union[str, Dict[str, torch.Tensor]]],
    config: Dict,
    grid_size: Tuple[int, int] = (3, 3),
    figsize: Tuple[int, int] = (15, 15),
) -> None:
    """
    Dibuja una grilla de imágenes con bounding boxes.

    Args:
        images (List[Union[str, torch.Tensor]]): Lista de rutas de imágenes o tensores de imágenes.
        targets (List[Union[str, Dict[str, torch.Tensor]]]): Lista de rutas a archivos XML de anotación o diccionarios con anotaciones.
        grid_size (Tuple[int, int], opcional): Tamaño de la grilla (filas, columnas). Por defecto es (3, 3).
        from_dataset (bool, opcional): Indica si los datos provienen de un dataset. Por defecto es False.

    Returns:
        None
    """
    fig, axs = plt.subplots(grid_size[0], grid_size[1], figsize=figsize)

    for i in range(grid_size[0] * grid_size[1]):
        img = images[i]
        target = targets[i]
        ax = axs[i // grid_size[1], i % grid_size[1]]
        draw_bbox(img, target, ax, config)

        # Show the image name if enabled in config.
        img_name = (
            os.path.basename(target["img_path"])
            if config.get("from_dataset_voc", False)
            else os.path.basename(img)
        )
        if config.get("show_image_name", False):
            ax.annotate(
                img_name,
                xy=(0.5, -0.05),
                xycoords="axes fraction",
                ha="center",
                fontsize=config.get("name_fontsize", 12),
            )

    # Add the super title if defined in config.
    if config.get("super_title", ""):
        fig.suptitle(
            config["super_title"],
            fontsize=config.get("super_title_fontsize", 16),
            y=config.get("super_title_y", 0.95),
        )

    plt.tight_layout()
    plt.show()


def show_single_image(
    image: Union[str, torch.Tensor],
    target: Union[str, Dict[str, torch.Tensor]],
    config: Dict,
    figsize: Tuple[int, int] = (8, 8),
) -> None:
    """
    Muestra una sola imagen con sus bounding boxes.

    Args:
        image (Union[str, torch.Tensor]): Ruta de la imagen o tensor de la imagen.
        target (Union[str, Dict[str, torch.Tensor]]): Ruta al archivo XML de anotación o diccionario con anotaciones.
        from_dataset (bool, opcional): Indica si los datos provienen de un dataset. Por defecto es False.

    Returns:
        None
    """
    fig, ax = plt.subplots(figsize=figsize)
    draw_bbox(image, target, ax, config)

    # Show the image name if enabled in config.
    img_name = (
        os.path.basename(target["img_path"])
        if config.get("from_dataset_voc", False)
        else os.path.basename(image)
    )
    if config.get("show_image_name", False):
        ax.annotate(
            img_name,
            xy=(0.5, -0.05),
            xycoords="axes fraction",
            ha="center",
            fontsize=config.get("name_fontsize", 12),
        )

    # Add the super title if defined in config.
    if config.get("super_title", ""):
        fig.suptitle(
            config["super_title"],
            fontsize=config.get("super_title_fontsize", 16),
            y=config.get("super_title_y", 0.95),
        )

    plt.tight_layout()
    plt.show()


def get_image_annotation_paths(
    image_dir: str, label_dir: str, max_files: int = 9
) -> Tuple[List[str], List[str]]:
    """
    Obtiene las rutas de imágenes y sus correspondientes anotaciones de forma aleatoria.

    Args:
        image_dir (str): Directorio que contiene las imágenes.
        label_dir (str): Directorio que contiene las anotaciones XML.
        max_files (int, opcional): Número máximo de archivos a procesar. Por defecto es 9.

    Returns:
        Tuple[List[str], List[str]]: Una tupla conteniendo dos listas:
            - Lista de rutas de imágenes.
            - Lista de rutas de anotaciones XML correspondientes.
    """
    image_files: List[str] = [
        f for f in os.listdir(image_dir) if f.endswith(FORMATOS_IMAGENES)
    ]

    # Randomly select up to max_files images.
    image_files = random.sample(image_files, min(len(image_files), max_files))

    image_paths: List[str] = [os.path.join(image_dir, img) for img in image_files]
    annotation_paths: List[str] = [
        os.path.join(label_dir, os.path.splitext(img)[0] + ".xml")
        for img in image_files
    ]

    return image_paths, annotation_paths


def get_random_image_and_annotation(image_dir: str, label_dir: str) -> Tuple[str, str]:
    """
    Obtiene una imagen aleatoria y su correspondiente anotación.

    Args:
        image_dir (str): Directorio que contiene las imágenes.
        label_dir (str): Directorio que contiene las anotaciones XML.

    Returns:
        Tuple[str, str]: Una tupla conteniendo:
            - Ruta de la imagen seleccionada aleatoriamente.
            - Ruta de la anotación XML correspondiente.
    """
    image_files: List[str] = [
        f for f in os.listdir(image_dir) if f.endswith(FORMATOS_IMAGENES)
    ]
    random_image: str = random.choice(image_files)

    image_path: str = os.path.join(image_dir, random_image)
    annotation_path: str = os.path.join(
        label_dir, os.path.splitext(random_image)[0] + ".xml"
    )

    return image_path, annotation_path

To begin our analysis of the image set, we first visualize one image with its annotations:

In [31]:
image_path, annotation_path = get_random_image_and_annotation(
    RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA, RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS
)
config = {
    "linewidth": 2,
    "edgecolor": "r",
    "facecolor": "none",
    "color": "white",
    "fontsize": 8,
    "backgroundcolor": "red",
    "putText": True,
    "show_image_name" : True,
    "name_fontsize" : 10,
    "super_title": "Visualización de un ejemplo de Palmeras anotadas",
    "super_title_fontsize": 16,
    "super_title_y": 0.90,
    "from_dataset_voc" : False,
}
show_single_image(image_path, annotation_path, config)
(Output: a sample image with annotated palm trees.)

We also visualize a grid of several images with their annotations:

In [15]:
image_paths, annotation_paths = get_image_annotation_paths(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA, RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS, max_files=9)
config = {
    "linewidth": 1,
    "edgecolor": "r",
    "facecolor": "none",
    "color": "white",
    "fontsize": 6,
    "backgroundcolor": "red",
    "putText": False,
    "show_image_name" : True,
    "name_fontsize" : 10,
    "super_title": "Visualización de varios ejemplos de Palmeras anotadas",
    "super_title_fontsize": 16,
    "super_title_y": 0.95,
    "from_dataset_voc" : False,
}
draw_grid(image_paths, annotation_paths, config)
(Output: a 3x3 grid of images with annotated palm trees.)

We can also use the FiftyOne library (https://docs.voxel51.com/) for richer visualization. Another advantage of this library is its export module, which supports several well-known formats:

In [16]:
# Load the dataset.
fiftyone_dataset_train = fo.Dataset.from_dir(
    dataset_type=fo.types.VOCDetectionDataset,
    dataset_dir=RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN,
    name=PALMERAS_KAGGLE_DATASET_VOC + '_train',
    overwrite=True
)

 100% |█████████████████| 279/279 [11.3s elapsed, 0s remaining, 49.3 samples/s]      

We can inspect the dataset's metadata:

In [17]:
print(fiftyone_dataset_train)
Name:        palmeras_kaggle_voc_format_train
Media type:  image
Num samples: 279
Persistent:  False
Tags:        []
Sample fields:
    id:               fiftyone.core.fields.ObjectIdField
    filepath:         fiftyone.core.fields.StringField
    tags:             fiftyone.core.fields.ListField(fiftyone.core.fields.StringField)
    metadata:         fiftyone.core.fields.EmbeddedDocumentField(fiftyone.core.metadata.ImageMetadata)
    created_at:       fiftyone.core.fields.DateTimeField
    last_modified_at: fiftyone.core.fields.DateTimeField
    ground_truth:     fiftyone.core.fields.EmbeddedDocumentField(fiftyone.core.labels.Detections)

The application UI can also be launched:

session = fo.launch_app(fiftyone_dataset_train)
session.open_tab()
Figure 2: FiftyOne UI for the palm tree dataset.

An advantage of FiftyOne is that datasets can be exported to several supported formats (https://docs.voxel51.com/user_guide/export_datasets.html | https://docs.voxel51.com/user_guide/export_datasets.html#custom-dataset-exporter). For example, exporting to COCO format would look like:

fiftyone_dataset_train.export(
    export_dir=RUTA_CARPETA_KAGGLE_DATASET_COCO,
    dataset_type=fo.types.COCODetectionDataset,
    label_field="ground_truth",
)

Machine learning model engineering

Since the model is not initially required to run in real time, the experiments focus on two-stage detection models (e.g. Faster R-CNN). One-stage models (e.g. YOLO) are also considered.

Faster R-CNN

🤖 AI query (ChatGPT 4o): 🤖

❔ Prompt:

Could you write an introduction to the Faster R-CNN model? Include its components, its advantages and disadvantages, and compare it with other state-of-the-art models.

🔃 Respuesta:

Faster R-CNN (Region-based Convolutional Neural Networks) es un modelo de detección de objetos que destaca por su precisión y eficiencia, especialmente en tareas de detección de múltiples clases en imágenes. Es una evolución de los modelos R-CNN (2014) y Fast R-CNN (2015), integrando mejoras que reducen el tiempo de procesamiento mientras mantienen un alto rendimiento. Faster R-CNN se ha vuelto un estándar en aplicaciones de visión por computadora que requieren detección precisa de objetos, como en vehículos autónomos, vigilancia y reconocimiento de objetos a partir de imágenes aéreas o satelitales.

Componentes de Faster R-CNN

  1. Red Convolucional Base: Faster R-CNN utiliza una red convolucional como base, comúnmente modelos preentrenados como VGG16, ResNet o MobileNet. Estas redes actúan como extractores de características que convierten una imagen de entrada en un mapa de características.

  2. Region Proposal Network (RPN): Uno de los principales avances de Faster R-CNN es la RPN. En lugar de generar propuestas de regiones usando métodos externos como Selective Search (usado en R-CNN y Fast R-CNN), la RPN genera las regiones propuestas (candidatas a contener objetos) directamente dentro de la red. La RPN es una pequeña red convolucional que desliza una ventana sobre el mapa de características, prediciendo posibles "anclas" que podrían contener objetos.

  3. RoI Pooling (Region of Interest Pooling): Después de generar las regiones propuestas, Faster R-CNN aplica una técnica llamada RoI Pooling para adaptar cada región propuesta a un tamaño fijo, lo que permite alimentar estas regiones a una red totalmente conectada (fully connected) para su clasificación y refinamiento de los bounding boxes.

  4. Clasificación y Refinamiento de Bounding Boxes: La última parte del modelo toma cada región propuesta y predice tanto la clase del objeto (o si es fondo) como las coordenadas refinadas del bounding box.
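La idea del RoI Pooling descrito arriba puede esbozarse en unas pocas líneas (sketch en Python puro sobre un mapa de características 2D de un solo canal; es una simplificación ilustrativa, no la implementación real de torchvision):

```python
def roi_pool(feature_map, roi, output_size):
    """Divide la región propuesta en una grilla fija y toma el máximo de cada celda.

    feature_map: lista de listas (H x W); roi: (x0, y0, x1, y1) en índices enteros;
    output_size: (alto, ancho) fijo de la salida.
    """
    x0, y0, x1, y1 = roi
    oh, ow = output_size
    h, w = y1 - y0, x1 - x0
    out = []
    for i in range(oh):
        # Límites verticales de la fila i de la grilla dentro de la región.
        ys, ye = y0 + i * h // oh, y0 + (i + 1) * h // oh
        row = []
        for j in range(ow):
            # Límites horizontales de la columna j.
            xs, xe = x0 + j * w // ow, x0 + (j + 1) * w // ow
            row.append(max(feature_map[y][x]
                           for y in range(ys, max(ye, ys + 1))
                           for x in range(xs, max(xe, xs + 1))))
        out.append(row)
    return out
```

Así, regiones propuestas de cualquier tamaño producen siempre una salida de tamaño fijo, apta para las capas totalmente conectadas.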

Ventajas de Faster R-CNN

  1. Precisión: Faster R-CNN es uno de los modelos de detección de objetos más precisos, ya que optimiza tanto la propuesta de regiones como la detección de objetos en un solo paso.

  2. Eficiencia: La integración de la Region Proposal Network permite un proceso más eficiente comparado con métodos anteriores, ya que la red aprende a generar propuestas más relevantes sin depender de técnicas externas que incrementan el tiempo de procesamiento.

  3. Flexibilidad: Puede adaptarse a diferentes tareas de detección, desde objetos pequeños en imágenes detalladas hasta objetos más grandes en escenas menos densas.

Desventajas de Faster R-CNN

  1. Velocidad: Aunque es significativamente más rápido que sus predecesores, sigue siendo más lento que modelos como YOLO o SSD, ya que sigue utilizando un enfoque de dos etapas (propuesta de regiones y clasificación).

  2. Complejidad Computacional: Requiere mayor poder computacional que modelos más livianos, lo que lo hace menos adecuado para aplicaciones en tiempo real o en dispositivos con restricciones de hardware.

Comparación con otros Modelos del Estado del Arte

  • YOLO (You Only Look Once): A diferencia de Faster R-CNN, YOLO es un modelo de detección de una sola etapa, lo que significa que realiza la detección de objetos en una sola pasada sobre la imagen. Esto lo hace extremadamente rápido y adecuado para aplicaciones en tiempo real, pero a menudo sacrifica algo de precisión en comparación con Faster R-CNN. YOLO predice directamente las clases y las coordenadas de los objetos desde la imagen completa, lo que es más eficiente pero puede resultar en detecciones menos precisas para objetos pequeños o con formas irregulares.

  • SSD (Single Shot Multibox Detector): Al igual que YOLO, SSD es un modelo de detección de una sola etapa que es más rápido que Faster R-CNN. Sin embargo, SSD tiende a tener un rendimiento ligeramente inferior en la detección de objetos pequeños y es menos preciso cuando se compara con Faster R-CNN en tareas que requieren alta precisión.

  • RetinaNet: Este modelo combina lo mejor de ambos mundos, con un enfoque de una etapa similar a YOLO y SSD, pero utilizando una pérdida focal que pone mayor peso en las clases raras o difíciles de detectar. Aunque es más rápido que Faster R-CNN, sigue siendo algo más lento que YOLO y SSD.

Conclusión

Faster R-CNN es un modelo robusto y preciso que sigue siendo uno de los más utilizados en tareas de detección de objetos donde la precisión es crítica. Aunque existen modelos más rápidos como YOLO y SSD, estos tienden a sacrificar precisión, particularmente en la detección de objetos pequeños. Por otro lado, modelos más recientes como RetinaNet buscan un equilibrio entre precisión y velocidad, pero Faster R-CNN sigue siendo la opción preferida cuando la prioridad es maximizar la exactitud de las detecciones.

Figura 3: Arquitectura de Faster R-CNN. Extraída de: https://towardsdatascience.com/faster-r-cnn-for-object-detection-a-technical-summary-474c5b857b46

Preparar el dataset¶

En el formato actual del conjunto, es necesario generar una clase Dataset personalizada para el entrenamiento. Para ello podemos tomar como ejemplo este tutorial: https://pytorch.org/tutorials/intermediate/torchvision_tutorial.html

📝 Nota:

Para entrenar la red Faster R-CNN, es necesario que se cumplan algunos requerimientos, entre los cuales se encuentra una definición de la estructura del Dataset como la siguiente:

The only specificity that we require is that the dataset __getitem__ should return a tuple:
  • image: torchvision.tv_tensors.Image of shape [3, H, W], a pure tensor, or a PIL Image of size (H, W)
  • target: a dict containing the following fields
    • boxes, torchvision.tv_tensors.BoundingBoxes of shape [N, 4]: the coordinates of the N bounding boxes in [x0, y0, x1, y1] format, ranging from 0 to W and 0 to H
    • labels, integer torch.Tensor of shape [N]: the label for each bounding box. 0 represents always the background class.
    • image_id, int: an image identifier. It should be unique between all the images in the dataset, and is used during evaluation
    • area, float torch.Tensor of shape [N]: the area of the bounding box. This is used during evaluation with the COCO metric, to separate the metric scores between small, medium and large boxes.
    • iscrowd, uint8 torch.Tensor of shape [N]: instances with iscrowd=True will be ignored during evaluation.
    • (optionally) masks, torchvision.tv_tensors.Mask of shape [N, H, W]: the segmentation masks for each one of the objects

También cabe destacar que el modelo considera la clase 0 como el fondo, lo que debe tenerse en cuenta al definir las labels.
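La estructura anterior puede verificarse con un pequeño chequeo en Python puro (sketch ilustrativo, sin torch: los tensores se representan aquí como listas y `validate_target` es un nombre hipotético):

```python
# Claves mínimas del `target` que requiere Faster R-CNN, según la especificación anterior.
REQUIRED_KEYS = {"boxes", "labels", "image_id", "area", "iscrowd"}

def validate_target(target, img_w, img_h):
    """Devuelve una lista de problemas encontrados (vacía si el target es válido)."""
    problems = [f"falta la clave '{k}'" for k in REQUIRED_KEYS - target.keys()]
    if problems:
        return problems
    # Cada caja debe cumplir 0 <= x0 < x1 <= W y 0 <= y0 < y1 <= H.
    for box in target["boxes"]:
        x0, y0, x1, y1 = box
        if not (0 <= x0 < x1 <= img_w and 0 <= y0 < y1 <= img_h):
            problems.append(f"box inválida: {box}")
    # La etiqueta 0 está reservada para el fondo.
    if any(lbl == 0 for lbl in target["labels"]):
        problems.append("la etiqueta 0 está reservada para el fondo")
    if len(target["labels"]) != len(target["boxes"]):
        problems.append("labels y boxes difieren en longitud")
    return problems
```

Un chequeo de este estilo ayuda a detectar targets mal formados antes de que fallen silenciosamente dentro del entrenamiento.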

Primero creamos el conjunto personalizado sin las transformaciones:

📝 Nota:

En la creación de este conjunto hay varios matices, entre los cuales se encuentra el tratamiento de ejemplos que no tienen ninguna detección.

In [18]:
class PalmTreeDataset(Dataset):
    """Clase que crea el dataset partiendo de las imágenes en las carpetas data y labels"""
    def __init__(self, data_dir, labels_dir):
        self.data_dir = data_dir
        self.labels_dir = labels_dir
        self.imgs = list(sorted(os.listdir(data_dir)))
        self.labels = list(sorted(os.listdir(labels_dir)))

    def __len__(self):
        return len(self.imgs)

    def __getitem__(self, idx):
        # Cargar imagen
        img_path = os.path.join(self.data_dir, self.imgs[idx])
        label_path = os.path.join(self.labels_dir, self.labels[idx])
        img = read_image(img_path)

        # Parsear archivo XML (PascalVOC)
        tree = ET.parse(label_path)
        root = tree.getroot()
        boxes = []
        labels = []
        isCrowd = []
        for obj in root.findall("object"):
            label = obj.find("name").text
            if label in LABELS.keys():
                bbox = obj.find("bndbox")
                xmin = int(bbox.find("xmin").text)
                ymin = int(bbox.find("ymin").text)
                xmax = int(bbox.find("xmax").text)
                ymax = int(bbox.find("ymax").text)

                boxes.append([xmin, ymin, xmax, ymax])
                labels.append(LABELS[label])  # Hay solo una clase: Palmera
                isCrowd.append(0)  # Seteamos como falso.

        # Convertimos a tensores
        # Tratamiento de cuando no hay detecciones.
        if len(boxes) > 0:
            boxes = tv_tensors.BoundingBoxes(
                boxes, format="XYXY", canvas_size=img.shape[-2:]  # canvas_size espera (H, W)
            )
            area = (boxes[:, 3] - boxes[:, 1]) * (boxes[:, 2] - boxes[:, 0])
        else:
            # No hay ningún objeto detectado.
            boxes = torch.zeros((0, 4), dtype=torch.float32)
            area = torch.tensor([], dtype=torch.float32)

        # Identificador único de la imagen dentro del dataset.
        image_id = idx
        isCrowd = torch.as_tensor(isCrowd, dtype=torch.int64)
        labels = torch.as_tensor(labels, dtype=torch.int64)
        img = tv_tensors.Image(img)

        target = {}
        target["boxes"] = boxes
        target["labels"] = labels
        target["image_id"] = image_id
        target["area"] = area
        target["iscrowd"] = isCrowd
        target["img_path"] = img_path

        return img, target
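La lógica de parseo Pascal VOC de `__getitem__` puede probarse de forma aislada con un XML mínimo (el XML de ejemplo es inventado, solo ilustra el formato):

```python
import xml.etree.ElementTree as ET

# Anotación PascalVOC mínima (hipotética) con un solo objeto.
xml_ejemplo = """
<annotation>
  <object>
    <name>Palm</name>
    <bndbox><xmin>10</xmin><ymin>20</ymin><xmax>50</xmax><ymax>60</ymax></bndbox>
  </object>
</annotation>
"""

root = ET.fromstring(xml_ejemplo)
boxes = []
for obj in root.findall("object"):
    bbox = obj.find("bndbox")
    # Mismo orden [xmin, ymin, xmax, ymax] que usa PalmTreeDataset.
    boxes.append([int(bbox.find(t).text) for t in ("xmin", "ymin", "xmax", "ymax")])

print(boxes)  # [[10, 20, 50, 60]]
```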

Definir las transformaciones¶

Si utilizáramos Albumentations, un posible pipeline de transformaciones sería:

# Utilizando Albumentations
def get_transforms(train=True):
    transforms = []
    if train:
        transforms.append(A.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1, p=0.5)) # Ajuste HSV
        transforms.append(A.GaussianBlur(blur_limit=3, sigma_limit=(0.1, 2.0), p=0.5))  # Desenfoque gaussiano
        transforms.append(A.Rotate(limit=10, p=0.5))
        transforms.append(A.HorizontalFlip(p=0.5))
        transforms.append(A.VerticalFlip(p=0.5))
        transforms.append(A.Affine(scale=(0.8, 1.2), translate_percent=(0.1, 0.1), p=0.5))
    transforms.append(A.LongestMaxSize(max_size=IMG_SIZE))
    transforms.append(ToTensorV2())
    return A.Compose(transforms, bbox_params=A.BboxParams(format='pascal_voc', label_fields=['labels']))

class TransformedPalmTreeDatasetAlbumentations(Dataset):
    """Clase de más alto rango que aplica las transformaciones al PalmTreeDataset"""
    def __init__(self, dataset, transforms=None):
        self.dataset = dataset
        self.transforms = transforms

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, idx):
        img, target = self.dataset[idx]
        
        # Convertir la imagen a formato NumPy para que funcione con Albumentations
        img = img.permute(1, 2, 0).numpy()  # Cambiar de CxHxW a HxWxC

        boxes = target['boxes'].tolist()
        labels = target['labels'].tolist()
        
        if self.transforms is not None:
            # Aplicar transformaciones
            transformed = self.transforms(image=img, bboxes=boxes, labels=labels)
            img = transformed['image']
            boxes = transformed['bboxes']
            labels = transformed['labels']

        # Reconstruir los bounding boxes como un tensor
        target['boxes'] = torch.tensor(boxes, dtype=torch.float32)
        target['labels'] = torch.tensor(labels, dtype=torch.int64)
        
        return img, target

Sin embargo, las transformaciones, si bien mejoran los resultados, como se muestra en el clásico paper de Google, también aumentan el tiempo de procesamiento. Por eso, en este punto se definen estas transformaciones simples:

In [19]:
def get_transforms(train):
    transforms = []
    if train:
        transforms.append(T.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1))
        transforms.append(T.GaussianBlur(kernel_size=3, sigma=(0.1, 2.0)))
    transforms.append(T.Resize(size=IMG_SIZE))
    transforms.append(T.ToDtype(torch.float, scale=True))
    transforms.append(T.ToPureTensor())
    return T.Compose(transforms)

Creamos un dataset envolvente que aplique las transformaciones. Se aplican de este modo para poder separar previamente el conjunto en train/test.

In [20]:
class TransformedPalmTreeDataset(Dataset):
    """Clase de más alto rango que aplica las transofrmaciones al PalmsDataset"""
    def __init__(self, dataset, transforms=None):
        self.dataset = dataset
        self.transforms = transforms

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, idx):
        img, target = self.dataset[idx]
        if self.transforms is not None:
            img, target = self.transforms(img, target)
        return img, target

Creamos los datasets:

📝 Nota:

En este punto se podría dividir el conjunto en train/test/validation de la siguiente forma:

# Creamos el dataset
full_dataset = PalmTreeDataset(RUTA_CARPETA_KAGGLE_DATASET_VOC_DATA, RUTA_CARPETA_KAGGLE_DATASET_VOC_LABELS)

# Lo dividimos en train/test
train_dataset, test_dataset = torch.utils.data.random_split(full_dataset, TRAIN_TEST_SPLIT_RATIO, torch.Generator().manual_seed(RANDOM_SEED))

Sin embargo, las imágenes ya están divididas y, para mantener la coherencia entre los modelos, se mantiene esta estrategia.

In [21]:
train_dataset_raw = PalmTreeDataset(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_DATA, RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_LABELS)
test_dataset_raw = PalmTreeDataset(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA, RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS)

# Aplicamos las transformaciones.
train_dataset = TransformedPalmTreeDataset(train_dataset_raw, transforms=get_transforms(True))
test_dataset = TransformedPalmTreeDataset(test_dataset_raw, transforms=get_transforms(False))
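Como cada imagen puede tener un número distinto de cajas, el DataLoader de detección necesita una `collate_fn` que agrupe las muestras en tuplas en lugar de apilarlas en un tensor (las utilidades de torchvision de `references/detection` incluyen una función equivalente):

```python
def collate_fn(batch):
    # Cada elemento del batch es (img, target); los targets tienen tamaño
    # variable, así que se agrupan como tuplas en lugar de apilarse.
    return tuple(zip(*batch))
```

Uso típico: `DataLoader(train_dataset, batch_size=4, shuffle=True, collate_fn=collate_fn)`.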

Visualizamos las transformaciones para una imagen:

In [22]:
random_idx = torch.randint(0, len(train_dataset), (1,)).item()
image, target = train_dataset[random_idx]

output_image = draw_bounding_boxes(image, target['boxes'], colors='red', width=2)

plt.figure()
plt.imshow(output_image.permute(1, 2, 0))
Out[22]:
<matplotlib.image.AxesImage at 0x7e58404bc970>

O para una grilla de imágenes:

In [23]:
config = {
    "linewidth": 2,
    "edgecolor": "r",
    "facecolor": "none",
    "color": "white",
    "fontsize": 6,
    "backgroundcolor": "red",
    "putText": False,
    "show_image_name": True,
    "name_fontsize": 10,
    "super_title": "Visualización de data augmentation",
    "super_title_fontsize": 16,
    "super_title_y": 0.95,
    "from_dataset_voc": True,
}

# Generar 9 índices aleatorios del dataset
random_indices = random.sample(range(len(train_dataset)), 9)

draw_grid(
    [train_dataset[i][0] for i in random_indices],
    [train_dataset[i][1] for i in random_indices],
    config,
)

Definición del modelo¶

Utilizaremos el modelo preentrenado en el dataset COCO (https://pytorch.org/vision/stable/models/generated/torchvision.models.detection.fasterrcnn_resnet50_fpn_v2.html#torchvision.models.detection.fasterrcnn_resnet50_fpn_v2). Primero observamos cómo está compuesto el modelo para cambiar las capas correspondientes a nuestro caso:

In [24]:
preview_model = torchvision.models.detection.fasterrcnn_resnet50_fpn_v2(weights=None)
print(preview_model)
FasterRCNN(
  (transform): GeneralizedRCNNTransform(
      Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
      Resize(min_size=(800,), max_size=1333, mode='bilinear')
  )
  (backbone): BackboneWithFPN(
    (body): IntermediateLayerGetter(
      (conv1): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (maxpool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
      (layer1): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
            (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
      (layer2): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
            (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (3): Bottleneck(
          (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
      (layer3): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False)
            (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (3): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (4): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (5): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
      (layer4): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False)
            (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
    )
    (fpn): FeaturePyramidNetwork(
      (inner_blocks): ModuleList(
        (0): Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (1): Conv2dNormActivation(
          (0): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (2): Conv2dNormActivation(
          (0): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (3): Conv2dNormActivation(
          (0): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (layer_blocks): ModuleList(
        (0-3): 4 x Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (extra_blocks): LastLevelMaxPool()
    )
  )
  (rpn): RegionProposalNetwork(
    (anchor_generator): AnchorGenerator()
    (head): RPNHead(
      (conv): Sequential(
        (0): Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
          (1): ReLU(inplace=True)
        )
        (1): Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
          (1): ReLU(inplace=True)
        )
      )
      (cls_logits): Conv2d(256, 3, kernel_size=(1, 1), stride=(1, 1))
      (bbox_pred): Conv2d(256, 12, kernel_size=(1, 1), stride=(1, 1))
    )
  )
  (roi_heads): RoIHeads(
    (box_roi_pool): MultiScaleRoIAlign(featmap_names=['0', '1', '2', '3'], output_size=(7, 7), sampling_ratio=2)
    (box_head): FastRCNNConvFCHead(
      (0): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (1): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (2): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (3): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (4): Flatten(start_dim=1, end_dim=-1)
      (5): Linear(in_features=12544, out_features=1024, bias=True)
      (6): ReLU(inplace=True)
    )
    (box_predictor): FastRCNNPredictor(
      (cls_score): Linear(in_features=1024, out_features=91, bias=True)
      (bbox_pred): Linear(in_features=1024, out_features=364, bias=True)
    )
  )
)
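A partir de la salida anterior puede verificarse la aritmética de las cabezas: el `box_predictor` preentrenado usa las 91 categorías de COCO (fondo incluido) con 4 coordenadas por clase, y la RPN usa 3 anclas por posición:

```python
num_classes_coco = 91     # categorías de COCO, fondo incluido (cls_score)
coords_por_caja = 4       # (x0, y0, x1, y1)
assert coords_por_caja * num_classes_coco == 364  # bbox_pred del box_predictor

anclas_por_posicion = 3   # cls_logits de la RPN
assert coords_por_caja * anclas_por_posicion == 12  # bbox_pred de la RPN
```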

Como haremos fine-tuning de este modelo, lo único que debemos cambiar son las últimas capas:

(box_predictor): FastRCNNPredictor(
      (cls_score): Linear(in_features=1024, out_features=91, bias=True)
      (bbox_pred): Linear(in_features=1024, out_features=364, bias=True)
    )

Definimos una función especial para esto, a la que se le pasa el número de clases que debe predecir. También permite recuperar el modelo desde el último checkpoint de entrenamiento:

In [25]:
def get_fasterrcnn_resnet50_fpn_v2_model(num_classes: int, checkpoint_path: str = None, device: str = 'cpu') -> tuple:
    # Cargar un modelo de Faster R-CNN con pesos preentrenados en COCO
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn_v2(weights='DEFAULT')
    
    # Obtener el número de entradas de la capa de clasificación
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    
    # Reemplazar la cabeza del clasificador por una que tenga el número de clases deseado
    model.roi_heads.box_predictor = torchvision.models.detection.faster_rcnn.FastRCNNPredictor(in_features, num_classes + 1)

    # Optimizer
    params = [p for p in model.parameters() if p.requires_grad]
    optimizer = optim.SGD(params, lr=0.005, momentum=0.9, weight_decay=0.0005)

    # Planificador de la tasa de aprendizaje (LR scheduler)
    lr_scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)

    # Cargar los pesos del modelo desde un checkpoint si fue especificado.
    model.to(device)
    data = {}
    if checkpoint_path and os.path.exists(checkpoint_path):
        print(f"Cargando el modelo desde el checkpoint: {checkpoint_path}")
        
        # Cargar los parámetros del checkpoint
        checkpoint = torch.load(checkpoint_path, map_location=device)

        # Restaurar el estado del modelo
        model.load_state_dict(checkpoint['model_state_dict'])
        optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
        lr_scheduler.load_state_dict(checkpoint['scheduler_state_dict'])
        data = checkpoint['data']
        print(f"\nModelo cargado desde el checkpoint en la época {data['epoch']}.\n")
    else:
        print("\nCargando el modelo desde 0.\n")
        
    return model, optimizer, lr_scheduler, data
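Como referencia, el checkpoint que espera esta función debe tener la siguiente estructura (sketch de la contraparte de guardado, derivado de las claves que lee el código anterior; `build_checkpoint` es un nombre hipotético):

```python
def build_checkpoint(model_state, optimizer_state, scheduler_state, epoch):
    # Mismas claves que lee get_fasterrcnn_resnet50_fpn_v2_model al restaurar.
    return {
        "model_state_dict": model_state,
        "optimizer_state_dict": optimizer_state,
        "scheduler_state_dict": scheduler_state,
        "data": {"epoch": epoch},
    }

# En el bucle de entrenamiento real se guardaría con:
# torch.save(build_checkpoint(model.state_dict(), optimizer.state_dict(),
#                             lr_scheduler.state_dict(), epoch), save_checkpoint_path)
```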

Finalmente, instanciamos el modelo y vemos si efectivamente se cambiaron las últimas capas:

In [26]:
# Instanciamos el modelo.
save_checkpoint_path = os.path.join(RUTA_CARPETA_MODELS, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID + '_checkpoint.tar')
if LOAD_MODEL_FROM_CHECKPOINT and os.path.exists(save_checkpoint_path):
    model, optimizer, lr_scheduler, checkpoint_data = get_fasterrcnn_resnet50_fpn_v2_model(len(LABELS.keys()), save_checkpoint_path, DEVICE)
else:
    model, optimizer, lr_scheduler, checkpoint_data = get_fasterrcnn_resnet50_fpn_v2_model(len(LABELS.keys()), None, DEVICE)
print(model)
Downloading: "https://download.pytorch.org/models/fasterrcnn_resnet50_fpn_v2_coco-dd69338a.pth" to /root/.cache/torch/hub/checkpoints/fasterrcnn_resnet50_fpn_v2_coco-dd69338a.pth
100%|██████████| 167M/167M [00:00<00:00, 194MB/s]  
Cargando el modelo desde 0.

FasterRCNN(
  (transform): GeneralizedRCNNTransform(
      Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
      Resize(min_size=(800,), max_size=1333, mode='bilinear')
  )
  (backbone): BackboneWithFPN(
    (body): IntermediateLayerGetter(
      (conv1): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (maxpool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
      (layer1): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
            (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
      (layer2): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
            (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (3): Bottleneck(
          (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
      (layer3): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False)
            (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (3): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (4): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (5): Bottleneck(
          (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
      (layer4): Sequential(
        (0): Bottleneck(
          (conv1): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
          (downsample): Sequential(
            (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False)
            (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          )
        )
        (1): Bottleneck(
          (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (2): Bottleneck(
          (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
      )
    )
    (fpn): FeaturePyramidNetwork(
      (inner_blocks): ModuleList(
        (0): Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (1): Conv2dNormActivation(
          (0): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (2): Conv2dNormActivation(
          (0): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (3): Conv2dNormActivation(
          (0): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (layer_blocks): ModuleList(
        (0-3): 4 x Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
      )
      (extra_blocks): LastLevelMaxPool()
    )
  )
  (rpn): RegionProposalNetwork(
    (anchor_generator): AnchorGenerator()
    (head): RPNHead(
      (conv): Sequential(
        (0): Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
          (1): ReLU(inplace=True)
        )
        (1): Conv2dNormActivation(
          (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
          (1): ReLU(inplace=True)
        )
      )
      (cls_logits): Conv2d(256, 3, kernel_size=(1, 1), stride=(1, 1))
      (bbox_pred): Conv2d(256, 12, kernel_size=(1, 1), stride=(1, 1))
    )
  )
  (roi_heads): RoIHeads(
    (box_roi_pool): MultiScaleRoIAlign(featmap_names=['0', '1', '2', '3'], output_size=(7, 7), sampling_ratio=2)
    (box_head): FastRCNNConvFCHead(
      (0): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (1): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (2): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (3): Conv2dNormActivation(
        (0): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (2): ReLU(inplace=True)
      )
      (4): Flatten(start_dim=1, end_dim=-1)
      (5): Linear(in_features=12544, out_features=1024, bias=True)
      (6): ReLU(inplace=True)
    )
    (box_predictor): FastRCNNPredictor(
      (cls_score): Linear(in_features=1024, out_features=2, bias=True)
      (bbox_pred): Linear(in_features=1024, out_features=8, bias=True)
    )
  )
)
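En la salida anterior puede verificarse la aritmética de las cabezas: la RPN predice 3 anclas por ubicación espacial (de ahí los 3 canales de `cls_logits` y 3·4 = 12 de `bbox_pred`), y el `box_predictor` reemplazado tiene 2 salidas de clase (palmera + fondo) y 2·4 = 8 regresiones de caja. Un cálculo mínimo de comprobación (los nombres de variables son ilustrativos):

```python
num_anchors = 3   # anclas por ubicación espacial (3 relaciones de aspecto)
num_classes = 2   # 1 clase (palmera) + fondo

rpn_cls_channels = num_anchors        # puntaje de objeto por ancla -> 3
rpn_bbox_channels = num_anchors * 4   # (dx, dy, dw, dh) por ancla -> 12
box_cls_outputs = num_classes         # logits por clase -> 2
box_bbox_outputs = num_classes * 4    # 4 coordenadas por clase -> 8
print(rpn_cls_channels, rpn_bbox_channels, box_cls_outputs, box_bbox_outputs)
```

Estos números coinciden con las dimensiones de salida de `cls_logits`, `bbox_pred`, `cls_score` y `bbox_pred` impresas arriba.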

Podemos observar un resumen del modelo mediante torchinfo:

In [27]:
img, _ = test_dataset[0]
input_size = (1, *img.shape)
summary(model, input_size=input_size)
Out[27]:
====================================================================================================
Layer (type:depth-idx)                             Output Shape              Param #
====================================================================================================
FasterRCNN                                         [100, 4]                  --
├─GeneralizedRCNNTransform: 1-1                    [1, 3, 800, 1088]         --
├─BackboneWithFPN: 1-2                             [1, 256, 13, 17]          --
│    └─IntermediateLayerGetter: 2-1                [1, 2048, 25, 34]         --
│    │    └─Conv2d: 3-1                            [1, 64, 400, 544]         (9,408)
│    │    └─BatchNorm2d: 3-2                       [1, 64, 400, 544]         (128)
│    │    └─ReLU: 3-3                              [1, 64, 400, 544]         --
│    │    └─MaxPool2d: 3-4                         [1, 64, 200, 272]         --
│    │    └─Sequential: 3-5                        [1, 256, 200, 272]        (215,808)
│    │    └─Sequential: 3-6                        [1, 512, 100, 136]        1,219,584
│    │    └─Sequential: 3-7                        [1, 1024, 50, 68]         7,098,368
│    │    └─Sequential: 3-8                        [1, 2048, 25, 34]         14,964,736
│    └─FeaturePyramidNetwork: 2-2                  [1, 256, 13, 17]          --
│    │    └─ModuleList: 3-15                       --                        (recursive)
│    │    └─ModuleList: 3-16                       --                        (recursive)
│    │    └─ModuleList: 3-15                       --                        (recursive)
│    │    └─ModuleList: 3-16                       --                        (recursive)
│    │    └─ModuleList: 3-15                       --                        (recursive)
│    │    └─ModuleList: 3-16                       --                        (recursive)
│    │    └─ModuleList: 3-15                       --                        (recursive)
│    │    └─ModuleList: 3-16                       --                        (recursive)
│    │    └─LastLevelMaxPool: 3-17                 [1, 256, 200, 272]        --
├─RegionProposalNetwork: 1-3                       [964, 4]                  --
│    └─RPNHead: 2-3                                [1, 3, 200, 272]          --
│    │    └─Sequential: 3-18                       [1, 256, 200, 272]        1,180,160
│    │    └─Conv2d: 3-19                           [1, 3, 200, 272]          771
│    │    └─Conv2d: 3-20                           [1, 12, 200, 272]         3,084
│    │    └─Sequential: 3-21                       [1, 256, 100, 136]        (recursive)
│    │    └─Conv2d: 3-22                           [1, 3, 100, 136]          (recursive)
│    │    └─Conv2d: 3-23                           [1, 12, 100, 136]         (recursive)
│    │    └─Sequential: 3-24                       [1, 256, 50, 68]          (recursive)
│    │    └─Conv2d: 3-25                           [1, 3, 50, 68]            (recursive)
│    │    └─Conv2d: 3-26                           [1, 12, 50, 68]           (recursive)
│    │    └─Sequential: 3-27                       [1, 256, 25, 34]          (recursive)
│    │    └─Conv2d: 3-28                           [1, 3, 25, 34]            (recursive)
│    │    └─Conv2d: 3-29                           [1, 12, 25, 34]           (recursive)
│    │    └─Sequential: 3-30                       [1, 256, 13, 17]          (recursive)
│    │    └─Conv2d: 3-31                           [1, 3, 13, 17]            (recursive)
│    │    └─Conv2d: 3-32                           [1, 12, 13, 17]           (recursive)
│    └─AnchorGenerator: 2-4                        [217413, 4]               --
├─RoIHeads: 1-4                                    [100, 4]                  --
│    └─MultiScaleRoIAlign: 2-5                     [964, 256, 7, 7]          --
│    └─FastRCNNConvFCHead: 2-6                     [964, 1024]               --
│    │    └─Conv2dNormActivation: 3-33             [964, 256, 7, 7]          590,336
│    │    └─Conv2dNormActivation: 3-34             [964, 256, 7, 7]          590,336
│    │    └─Conv2dNormActivation: 3-35             [964, 256, 7, 7]          590,336
│    │    └─Conv2dNormActivation: 3-36             [964, 256, 7, 7]          590,336
│    │    └─Flatten: 3-37                          [964, 12544]              --
│    │    └─Linear: 3-38                           [964, 1024]               12,846,080
│    │    └─ReLU: 3-39                             [964, 1024]               --
│    └─FastRCNNPredictor: 2-7                      [964, 2]                  --
│    │    └─Linear: 3-40                           [964, 2]                  2,050
│    │    └─Linear: 3-41                           [964, 8]                  8,200
====================================================================================================
Total params: 43,256,153
Trainable params: 43,030,809
Non-trainable params: 225,344
Total mult-adds (G): 329.84
====================================================================================================
Input size (MB): 1.44
Forward/backward pass size (MB): 4764.00
Params size (MB): 173.02
Estimated Total Size (MB): 4938.46
====================================================================================================
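Los conteos de parámetros del resumen pueden reproducirse a mano para las capas lineales finales: una capa `Linear(in, out)` tiene `in·out + out` parámetros (pesos más sesgos). Un cálculo de verificación (la función auxiliar `linear_params` es solo ilustrativa):

```python
def linear_params(in_features, out_features):
    # pesos (in_features * out_features) + sesgos (out_features)
    return in_features * out_features + out_features

fc = linear_params(12544, 1024)   # Linear 3-38 del resumen
cls = linear_params(1024, 2)      # cls_score del box_predictor
bbox = linear_params(1024, 8)     # bbox_pred del box_predictor
print(fc, cls, bbox)  # 12846080 2050 8200
```

Los tres valores coinciden con las filas correspondientes de la tabla de torchinfo.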
In [28]:
# Limpiamos la caché de la GPU.
clean_gpu_usage()
Initial GPU Usage
| ID | GPU | MEM |
------------------
|  0 | 22% | 10% |
GPU Usage after emptying the cache
| ID | GPU | MEM |
------------------
|  0 | 10% |  3% |

Para comprobar el funcionamiento del modelo, podemos realizar una prueba rápida de entrenamiento e inferencia:

In [29]:
data_loader = DataLoader(
    train_dataset, batch_size=2, shuffle=True, collate_fn=lambda x: tuple(zip(*x))
)

model.to(DEVICE)
# Entrenamiento
images, targets = next(iter(data_loader))
# Es necesario para que no compute los gradientes; si no, queda "basura" en la GPU.
with torch.no_grad():
    images = list(image.to(DEVICE) for image in images)
    targets = [
        {k: (v.to(DEVICE) if isinstance(v, torch.Tensor) else v) for k, v in t.items()}
        for t in targets
    ]
    output = model(images, targets)  # En modo entrenamiento devuelve el diccionario de pérdidas
    print(f"Training test: {output}")

# Inferencia
model.eval()
with torch.no_grad():
    x = [torch.rand(3, 300, 400).to(DEVICE), torch.rand(3, 500, 400).to(DEVICE)]
    predictions = model(x)  # Devuelve las predicciones
    print(f"Inference test: {predictions}")
Training test: {'loss_classifier': tensor(0.6804, device='cuda:0'), 'loss_box_reg': tensor(0.6662, device='cuda:0'), 'loss_objectness': tensor(2.6651, device='cuda:0'), 'loss_rpn_box_reg': tensor(0.1652, device='cuda:0')}
Inference test: [{'boxes': tensor([[1.1838e+00, 6.9196e+01, 2.3142e+01, 9.1906e+01],
        [2.0914e+00, 1.9127e+00, 2.3696e+01, 2.3896e+01],
        [1.1071e+00, 5.9000e+00, 2.9532e+01, 1.1002e+02],
        [7.8116e-01, 3.8419e+01, 4.8221e+00, 2.1711e+02],
        [2.0872e+00, 2.1879e+02, 1.7378e+01, 2.8514e+02],
        [3.9836e-01, 1.3803e+01, 6.1088e+00, 1.0794e+02],
        [4.2202e+00, 2.0602e+01, 7.8354e+01, 2.2338e+02],
        [5.0875e-01, 9.7071e+01, 3.7838e+00, 3.0000e+02],
        [3.3040e-01, 1.6243e+00, 2.4543e+00, 7.9984e+00],
        [2.8347e+00, 2.3997e+02, 2.1734e+01, 2.9073e+02],
        [1.0745e+01, 9.0602e+00, 1.0062e+02, 9.7987e+01],
        [2.0819e+00, 2.2835e+01, 1.7662e+01, 9.3627e+01],
        [1.3934e+00, 1.2516e+01, 4.2324e+01, 2.0761e+02],
        [3.9529e+00, 1.9021e+02, 5.0760e+01, 2.4612e+02],
        [2.7400e-01, 1.9149e+02, 1.3847e+01, 2.2384e+02],
        [1.9286e+02, 1.3049e+02, 2.1644e+02, 1.5595e+02],
        [8.5947e-02, 9.5661e+01, 1.2750e+00, 2.1012e+02],
        [1.2327e+00, 9.4787e+00, 1.2566e+01, 2.4390e+02],
        [3.7159e+00, 1.2878e+02, 3.7331e+01, 2.5260e+02],
        [2.1700e+00, 1.6617e+02, 5.0782e+01, 2.2331e+02],
        [3.5076e-02, 1.9495e+02, 1.2038e+00, 2.7788e+02],
        [6.4926e+00, 3.0107e+00, 5.1604e+01, 2.3700e+01],
        [4.0101e+00, 3.7491e+00, 5.4818e+01, 5.5457e+01],
        [1.4809e+00, 1.0776e+01, 1.7659e+01, 4.7967e+01],
        [7.2419e-02, 1.4596e+02, 1.0681e+00, 2.6651e+02],
        [2.6972e-01, 3.9373e+00, 2.5625e+00, 1.2599e+01],
        [4.6435e-01, 1.3784e+01, 1.6058e+00, 1.1518e+02],
        [2.9308e-01, 6.4677e+01, 1.7640e+00, 1.7120e+02],
        [3.5298e+00, 9.0067e+01, 3.5453e+01, 2.0475e+02],
        [5.6804e-02, 2.7477e+01, 1.0741e+00, 1.2423e+02],
        [3.2147e+02, 1.7071e+02, 3.4544e+02, 1.9476e+02],
        [3.6411e+00, 7.3655e+00, 3.6200e+01, 7.0725e+01],
        [2.5458e+00, 1.9716e+02, 1.7516e+01, 2.5536e+02],
        [5.4481e+00, 6.9872e+01, 5.1643e+01, 1.2042e+02],
        [1.0716e+02, 1.0917e+02, 1.1169e+02, 1.1361e+02],
        [2.1038e+00, 1.9312e+01, 1.5508e+01, 1.6179e+02],
        [2.9056e+00, 3.5271e+01, 3.2552e+01, 1.5827e+02],
        [6.4408e+00, 1.6875e+00, 9.1419e+01, 1.9765e+01],
        [1.5883e+00, 2.7284e+02, 2.5581e+01, 2.9790e+02],
        [2.9982e+00, 9.5912e+01, 1.5944e+01, 1.6825e+02],
        [6.3398e+00, 1.0082e+00, 6.1967e+01, 9.3382e+00],
        [3.8002e+00, 5.1851e-01, 1.0214e+02, 1.0432e+01],
        [4.3802e-01, 1.8655e+00, 3.5049e+00, 1.1225e+01],
        [1.0822e+02, 1.1025e+02, 1.1236e+02, 1.1483e+02],
        [1.7376e+02, 1.1541e+02, 2.0003e+02, 1.4017e+02],
        [1.0701e+01, 6.7086e+00, 1.3412e+02, 1.3945e+02],
        [1.4562e+00, 4.4613e-01, 9.1344e+00, 3.0569e+00],
        [3.8447e+00, 1.6696e+02, 1.5122e+01, 2.4004e+02],
        [9.9593e+00, 1.3868e+02, 9.0940e+01, 2.3077e+02],
        [2.7560e-02, 5.4459e+00, 1.1902e+00, 8.7901e+01],
        [9.4716e+01, 7.0194e-01, 1.6560e+02, 1.6746e+00],
        [3.8166e+00, 2.2367e+02, 3.7211e+01, 2.9039e+02],
        [1.3099e-02, 4.9762e+01, 1.1817e+00, 1.7369e+02],
        [4.1312e+00, 6.8820e+01, 1.5107e+01, 1.4141e+02],
        [3.1483e+02, 1.7319e+02, 3.3661e+02, 1.9594e+02],
        [6.2488e-01, 7.1906e-01, 6.6207e+00, 3.5269e+00],
        [3.7218e+00, 1.4291e+02, 5.3254e+01, 1.9882e+02],
        [3.3209e+00, 1.3486e+02, 1.5242e+01, 2.1622e+02],
        [1.0595e+01, 4.6290e+00, 1.0690e+02, 3.7243e+01],
        [5.3710e+00, 1.6236e+00, 3.8800e+01, 1.8306e+01],
        [3.4794e+00, 5.5858e+01, 2.0158e+01, 1.1845e+02],
        [3.1076e-01, 4.3464e+00, 1.8500e+00, 6.1777e+01],
        [1.6865e+02, 1.2305e+02, 1.7173e+02, 1.2590e+02],
        [1.7586e-01, 1.8609e+02, 4.8988e+00, 3.0000e+02],
        [1.4805e+00, 1.3130e+02, 1.3113e+01, 2.8720e+02],
        [3.3126e+02, 1.1172e+02, 3.3665e+02, 1.1522e+02],
        [1.1555e-01, 6.8069e+00, 2.1260e+00, 2.4083e+01],
        [3.6034e+02, 1.8933e+00, 4.0000e+02, 3.4411e+01],
        [1.3855e+01, 7.7996e+01, 1.8602e+01, 8.1664e+01],
        [2.1527e+02, 1.0915e-01, 2.9274e+02, 1.2658e+00],
        [1.6683e+00, 4.0281e-01, 8.5713e+01, 5.6490e+00],
        [3.1014e+02, 2.1185e+00, 4.0000e+02, 1.8870e+01],
        [3.9484e+01, 1.0288e+00, 2.1749e+02, 4.2918e+00],
        [1.7647e+02, 1.2735e+02, 1.8039e+02, 1.3060e+02],
        [1.6185e+02, 1.7766e+02, 1.6584e+02, 1.8220e+02],
        [1.4054e+01, 8.5308e+01, 1.5412e+02, 2.1644e+02],
        [2.0078e-01, 2.3431e+02, 1.7856e+00, 3.0000e+02],
        [3.3376e+02, 9.4217e+01, 3.5795e+02, 1.1706e+02],
        [3.4937e+02, 1.5238e-01, 4.0000e+02, 1.8748e+00],
        [2.5362e+02, 1.8775e-01, 3.2085e+02, 1.2539e+00],
        [1.4355e+02, 2.2069e+02, 1.4756e+02, 2.2423e+02],
        [1.9599e+02, 1.3411e+02, 2.2511e+02, 1.6347e+02],
        [5.7509e-01, 2.4297e-01, 3.1842e+00, 4.9793e+00],
        [1.8273e+02, 1.3258e+02, 2.1365e+02, 1.6364e+02],
        [1.5874e+01, 7.4106e+01, 2.0625e+01, 7.7199e+01],
        [1.7554e+02, 1.2678e+02, 1.7852e+02, 1.3089e+02],
        [2.8416e+02, 1.9800e+02, 3.2428e+02, 2.1737e+02],
        [2.9554e+00, 2.7059e-01, 5.3310e+01, 2.2339e+00],
        [3.2463e+02, 1.2157e+00, 4.0000e+02, 5.4937e+00],
        [3.0187e+00, 2.3788e-01, 1.1854e+01, 2.3808e+00],
        [3.8098e+02, 9.1978e+00, 4.0000e+02, 5.8569e+01],
        [1.9829e+02, 1.5134e+01, 2.9126e+02, 1.0151e+02],
        [1.6367e+02, 1.1864e+02, 1.9315e+02, 1.4537e+02],
        [1.7011e+02, 1.2165e+02, 1.7372e+02, 1.2477e+02],
        [1.7569e+02, 1.2421e+02, 1.7872e+02, 1.2812e+02],
        [1.3117e+01, 7.6318e+01, 1.8046e+01, 7.9700e+01],
        [5.4787e-02, 3.8313e-01, 1.5586e+00, 6.9443e+00],
        [6.1164e-02, 2.2691e+00, 1.6730e+00, 1.0278e+01],
        [1.7136e+02, 1.3245e+02, 2.4726e+02, 1.6797e+02],
        [1.5230e-01, 4.4514e+00, 3.0023e+00, 2.0281e+01]], device='cuda:0'), 'labels': tensor([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
       device='cuda:0'), 'scores': tensor([0.6048, 0.5852, 0.5780, 0.5773, 0.5673, 0.5620, 0.5609, 0.5598, 0.5587, 0.5543, 0.5524, 0.5509, 0.5493, 0.5489, 0.5469, 0.5449, 0.5442, 0.5437, 0.5416, 0.5406, 0.5401, 0.5397, 0.5394, 0.5392, 0.5388, 0.5374, 0.5374, 0.5361, 0.5357, 0.5338, 0.5332, 0.5319, 0.5318, 0.5315, 0.5304, 0.5276, 0.5267, 0.5261, 0.5255,
        0.5249, 0.5248, 0.5244, 0.5229, 0.5228, 0.5221, 0.5219, 0.5210, 0.5206, 0.5204, 0.5201, 0.5193, 0.5189, 0.5186, 0.5184, 0.5174, 0.5173, 0.5165, 0.5161, 0.5153, 0.5145, 0.5139, 0.5115, 0.5112, 0.5111, 0.5106, 0.5103, 0.5085, 0.5084, 0.5082, 0.5081, 0.5080, 0.5076, 0.5065, 0.5055, 0.5047, 0.5047, 0.5037, 0.5031,
        0.5024, 0.5012, 0.5010, 0.5007, 0.5000, 0.4995, 0.4993, 0.4988, 0.4982, 0.4979, 0.4965, 0.4957, 0.4946, 0.4935, 0.4935, 0.4933, 0.4931, 0.4923, 0.4918, 0.4918, 0.4916, 0.4916], device='cuda:0')}, {'boxes': tensor([[2.9572e+00, 2.4565e+00, 3.2190e+01, 3.1069e+01],
        [2.0088e+02, 2.5354e+02, 2.0661e+02, 2.5716e+02],
        [3.0846e+00, 1.4146e+02, 3.2825e+01, 1.7229e+02],
        [2.1422e+00, 6.9173e+00, 4.1508e+01, 1.3565e+02],
        [5.9225e+00, 3.0755e+00, 6.5494e+01, 5.7274e+01],
        [4.8096e+00, 1.1340e+01, 7.2981e+01, 9.7655e+01],
        [1.5955e+01, 1.5767e+01, 1.3224e+02, 1.3304e+02],
        [6.7925e-01, 9.0284e+01, 5.0381e+00, 3.6284e+02],
        [8.4425e-01, 2.6454e+02, 1.7244e+01, 3.9061e+02],
        [6.6205e-01, 1.0590e+01, 6.7564e+00, 2.1159e+02],
        [2.1201e+01, 5.2711e+00, 2.2609e+02, 1.4988e+02],
        [6.2056e-01, 1.7410e+02, 4.7024e+00, 4.6401e+02],
        [7.9431e-01, 2.0826e+01, 8.7202e+00, 1.3981e+02],
        [2.0536e+00, 3.2215e-01, 7.1818e+01, 7.3487e+00],
        [6.7951e-01, 1.9672e+02, 2.1081e+01, 3.2557e+02],
        [1.5797e+00, 1.0635e+01, 1.5725e+01, 3.4195e+02],
        [3.2884e+00, 4.2028e+02, 2.4966e+01, 4.8669e+02],
        [1.9881e+02, 2.7372e+02, 2.0474e+02, 2.7812e+02],
        [9.4354e+00, 2.1675e+00, 8.3125e+01, 1.3218e+01],
        [1.4782e+02, 3.0753e+02, 1.5338e+02, 3.1153e+02],
        [2.0178e+02, 2.5219e+02, 2.0625e+02, 2.5536e+02],
        [6.7084e-01, 1.0757e+02, 2.0900e+00, 2.6625e+02],
        [2.7756e+00, 1.6120e+01, 3.0579e+01, 8.8541e+01],
        [3.2782e+00, 2.7495e+01, 2.2096e+01, 1.9295e+02],
        [4.8128e-02, 7.9474e+01, 1.5266e+00, 2.3062e+02],
        [4.5666e-01, 3.0428e+02, 1.7345e+00, 4.6057e+02],
        [5.7889e-02, 9.5392e+00, 5.0852e+00, 1.4950e+02],
        [5.5091e-01, 2.9246e+01, 1.8917e+00, 1.8952e+02],
        [1.4228e+00, 3.8099e+02, 8.1706e+00, 4.9336e+02],
        [1.2263e-01, 3.2505e+02, 1.3434e+00, 4.8270e+02],
        [2.7565e+01, 1.3733e+01, 1.7493e+02, 2.3225e+02],
        [7.0036e-01, 3.8301e+02, 2.4344e+00, 5.0000e+02],
        [1.7695e-01, 2.1097e+02, 1.4366e+00, 3.6848e+02],
        [6.9054e-02, 1.2689e+02, 1.4101e+00, 2.7683e+02],
        [8.1419e+00, 3.2950e+01, 1.0311e+02, 2.9950e+02],
        [4.8131e+00, 1.2541e+02, 2.2371e+01, 2.1387e+02],
        [4.5420e-01, 2.2778e+02, 1.8244e+00, 3.9729e+02],
        [3.6512e-01, 1.5571e+01, 2.0793e+00, 1.2108e+02],
        [1.1027e+01, 4.9755e+01, 1.2915e+02, 1.9833e+02],
        [1.4511e-01, 1.6314e+02, 1.5909e+00, 3.2323e+02],
        [1.9906e+02, 2.7244e+02, 2.0317e+02, 2.7682e+02],
        [6.5839e+00, 5.4642e+01, 6.7739e+01, 1.2650e+02],
        [1.9987e+02, 2.5426e+02, 2.0485e+02, 2.5779e+02],
        [2.9493e+02, 8.1122e+01, 3.4949e+02, 1.0579e+02],
        [5.5140e+00, 2.9824e+02, 1.9822e+01, 4.0786e+02],
        [3.7835e+00, 5.0734e+01, 4.8852e+01, 1.9669e+02],
        [9.4413e+00, 7.1423e-01, 2.0790e+02, 1.5162e+01],
        [1.6277e+00, 8.9502e+01, 3.4839e+01, 2.1772e+02],
        [5.1889e-02, 1.5333e+01, 1.4428e+00, 1.3498e+02],
        [2.0047e+02, 2.5229e+02, 2.0483e+02, 2.5551e+02],
        [2.3069e+02, 2.4127e+02, 2.3474e+02, 2.4578e+02],
        [1.6891e+02, 2.5262e+02, 2.0098e+02, 2.8441e+02],
        [1.4861e+02, 3.0908e+02, 1.5482e+02, 3.1305e+02],
        [2.3168e+02, 2.4066e+02, 2.3591e+02, 2.4487e+02],
        [1.4790e+02, 1.9031e+02, 2.0305e+02, 2.4863e+02],
        [1.1457e+02, 5.0807e+00, 2.2678e+02, 8.8072e+01],
        [7.7154e-02, 3.9603e+02, 1.5485e+00, 5.0000e+02],
        [4.5597e-02, 2.5482e+02, 1.0904e+00, 4.5117e+02],
        [2.3167e+02, 2.3934e+02, 2.3624e+02, 2.4327e+02],
        [7.2726e+00, 1.3105e+02, 6.7021e+01, 1.9353e+02],
        [1.9821e+02, 2.7403e+02, 2.0303e+02, 2.7945e+02],
        [1.7079e-01, 3.9485e+02, 5.0123e+00, 5.0000e+02],
        [8.5375e-01, 3.0523e+02, 6.2928e+00, 5.0000e+02],
        [1.0384e+00, 4.9475e+01, 2.6772e+01, 3.8799e+02],
        [2.2727e-02, 1.9030e+02, 1.0879e+00, 3.4134e+02],
        [8.5759e-01, 4.0329e+02, 1.6060e+01, 5.0000e+02],
        [1.9978e+02, 2.5134e+02, 2.0360e+02, 2.5513e+02],
        [5.4344e+00, 2.4743e+02, 2.1041e+01, 3.4845e+02],
        [8.2601e+00, 2.4818e+02, 7.1193e+01, 3.1724e+02],
        [1.0422e+01, 4.8586e+00, 7.2868e+01, 2.9846e+01],
        [1.2046e+01, 3.0324e+00, 1.2684e+02, 2.8320e+01],
        [1.9960e+02, 2.5301e+02, 2.0417e+02, 2.5659e+02],
        [3.2825e+00, 6.5435e+01, 6.2687e+01, 3.6329e+02],
        [5.1014e+00, 1.7162e+02, 2.1501e+01, 2.6657e+02],
        [2.3489e+01, 2.8921e+01, 7.3234e+01, 1.4216e+02],
        [1.0540e+02, 3.2902e+02, 1.6594e+02, 3.9463e+02],
        [6.5591e+00, 1.8629e+02, 4.7375e+01, 3.2890e+02],
        [1.4979e+02, 3.0992e+02, 1.5474e+02, 3.1525e+02],
        [8.7145e+00, 1.9857e+02, 7.2090e+01, 2.6936e+02],
        [4.3641e+00, 9.7006e-01, 1.6513e+02, 7.6226e+00],
        [1.9751e+02, 2.7181e+02, 2.0187e+02, 2.7538e+02],
        [1.9659e+02, 2.8176e+02, 2.0138e+02, 2.8552e+02],
        [1.9455e+02, 2.7586e+02, 1.9912e+02, 2.8029e+02],
        [4.7114e+00, 3.4956e+02, 2.2535e+01, 4.7276e+02],
        [1.3162e+01, 1.9603e+02, 1.2805e+02, 3.2259e+02],
        [1.5252e+02, 2.1008e+02, 1.8703e+02, 2.4498e+02],
        [1.1041e+02, 4.5710e-01, 3.1646e+02, 5.4713e+00],
        [1.5395e+02, 4.9900e-01, 4.0000e+02, 1.1191e+01],
        [1.2606e+00, 4.6167e+02, 2.4622e+01, 5.0000e+02],
        [3.1327e+02, 7.3127e-01, 4.0000e+02, 6.9836e+00],
        [1.2449e+02, 2.9084e+02, 1.4975e+02, 3.4894e+02],
        [2.0316e+02, 2.5234e+02, 2.0736e+02, 2.5603e+02],
        [6.2881e+00, 1.0501e+02, 4.6764e+01, 2.4712e+02],
        [1.9958e+02, 2.8225e+02, 2.0442e+02, 2.8616e+02],
        [1.9600e+02, 2.7538e+02, 2.0003e+02, 2.8076e+02],
        [7.0758e+00, 9.1582e+01, 6.5400e+01, 1.5080e+02],
        [8.9005e+01, 7.9374e+00, 3.9246e+02, 1.2294e+02],
        [3.3688e+02, 5.7885e+00, 4.0000e+02, 6.9993e+01],
        [3.7779e+02, 2.6538e+01, 3.9873e+02, 9.5263e+01],
        [2.6669e+02, 7.5127e+00, 3.9299e+02, 4.4415e+01]], device='cuda:0'), 'labels': tensor([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
       device='cuda:0'), 'scores': tensor([0.5589, 0.5493, 0.5484, 0.5391, 0.5372, 0.5351, 0.5336, 0.5296, 0.5286, 0.5280, 0.5243, 0.5218, 0.5189, 0.5188, 0.5187, 0.5178, 0.5151, 0.5146, 0.5140, 0.5139, 0.5138, 0.5137, 0.5133, 0.5129, 0.5116, 0.5110, 0.5096, 0.5095, 0.5090, 0.5077, 0.5076, 0.5074, 0.5071, 0.5069, 0.5060, 0.5049, 0.5035, 0.5032, 0.5031,
        0.5030, 0.5027, 0.5026, 0.5025, 0.5012, 0.5010, 0.5006, 0.5000, 0.4998, 0.4997, 0.4976, 0.4975, 0.4971, 0.4963, 0.4958, 0.4957, 0.4948, 0.4945, 0.4942, 0.4942, 0.4932, 0.4930, 0.4929, 0.4923, 0.4920, 0.4916, 0.4912, 0.4907, 0.4906, 0.4899, 0.4898, 0.4890, 0.4890, 0.4888, 0.4874, 0.4873, 0.4870, 0.4868, 0.4863,
        0.4861, 0.4842, 0.4829, 0.4817, 0.4814, 0.4810, 0.4806, 0.4805, 0.4804, 0.4794, 0.4788, 0.4783, 0.4779, 0.4776, 0.4773, 0.4769, 0.4765, 0.4761, 0.4761, 0.4761, 0.4756, 0.4754], device='cuda:0')}]
In [30]:
# Clear the GPU cache.
clean_gpu_usage()
Initial GPU Usage
| ID | GPU | MEM |
------------------
|  0 | 93% | 25% |
GPU Usage after emptying the cache
| ID | GPU | MEM |
------------------
|  0 | 36% |  3% |

Train the model¶

Class used to stop training early when the monitored metric (evaluation loss) fails to improve for N consecutive epochs.

In [31]:
class EarlyStopper:
    """Stops training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience: int = 1, min_delta: float = 0):
        self.patience = patience          # Epochs to wait without improvement.
        self.min_delta = min_delta        # Minimum worsening tolerated before counting.
        self.counter = 0
        self.min_validation_loss = float("inf")

    def early_stop(self, validation_loss: float) -> bool:
        if validation_loss < self.min_validation_loss:
            # New best loss: reset the patience counter.
            self.min_validation_loss = validation_loss
            self.counter = 0
        elif validation_loss >= (self.min_validation_loss + self.min_delta):
            self.counter += 1
            if self.counter >= self.patience:
                return True
        return False
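
As a quick sanity check, the stopping rule can be exercised with a synthetic loss curve. The class is re-declared here only so the snippet runs standalone; with `patience=2`, the stop fires after two consecutive epochs without improvement:

```python
class EarlyStopper:  # Same logic as the class above, repeated so this cell runs on its own.
    def __init__(self, patience: int = 1, min_delta: float = 0):
        self.patience = patience
        self.min_delta = min_delta
        self.counter = 0
        self.min_validation_loss = float("inf")

    def early_stop(self, validation_loss: float) -> bool:
        if validation_loss < self.min_validation_loss:
            self.min_validation_loss = validation_loss
            self.counter = 0
        elif validation_loss >= (self.min_validation_loss + self.min_delta):
            self.counter += 1
            if self.counter >= self.patience:
                return True
        return False

# Synthetic validation losses: two improvements, then two epochs without improvement.
stopper = EarlyStopper(patience=2, min_delta=0)
decisions = [stopper.early_stop(loss) for loss in [0.9, 0.7, 0.75, 0.8]]
print(decisions)  # → [False, False, False, True]
```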

Wrapper around train_one_epoch (runs training for a single epoch) from the torchvision detection training utilities in pytorch/resources/vision/detection:

In [32]:
def coco_train(
    model: nn.Module,
    data_loader: DataLoader,
    device: str,
    optimizer: optim.Optimizer,
    epoch: int = None,
    criterion: nn.Module = None,
    verbose: bool = True,
) -> tuple[float, float]:

    model.to(device)
    train_loss = 0.0
    train_acc = 0.0  # Not computed during detection training; kept for interface symmetry.

    metric_logger: MetricLogger = train_one_epoch(
        model, optimizer, data_loader, device, epoch, print_freq=10
    )
    train_loss = metric_logger.meters["loss"].global_avg  # Average loss over the epoch.
    return train_loss, train_acc

Wrapper around evaluate (evaluates with the COCO protocol, with metrics broken down by object size) from the torchvision detection training utilities in pytorch/resources/vision/detection:

In [33]:
def coco_evaluate(
    model: nn.Module,
    data_loader: DataLoader,
    device: str,
    optimizer: optim.Optimizer = None,
    epoch: int = None,
    criterion: nn.Module = None,
    verbose: bool = True,
) -> tuple[float, float]:

    model.to(device)
    test_loss = 0.0
    test_acc = 0.0

    coco_evaluator: CocoEvaluator = evaluate(model, data_loader, device)
    # Note: these slots actually hold COCO APs, not a loss/accuracy in the strict sense:
    # stats[0] = AP @ IoU=0.50:0.95, stats[1] = AP @ IoU=0.50.
    test_loss = coco_evaluator.coco_eval["bbox"].stats[0]
    test_acc = coco_evaluator.coco_eval["bbox"].stats[1]
    return test_loss, test_acc
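
The `stats` vector indexed above follows the standard 12-entry pycocotools ordering, which is useful to keep at hand when reading the evaluation logs further down. A reference sketch:

```python
# Standard pycocotools ordering of COCOeval.stats for bounding boxes.
COCO_STATS = [
    "AP @ IoU=0.50:0.95 | area=all    | maxDets=100",
    "AP @ IoU=0.50      | area=all    | maxDets=100",
    "AP @ IoU=0.75      | area=all    | maxDets=100",
    "AP @ IoU=0.50:0.95 | area=small  | maxDets=100",
    "AP @ IoU=0.50:0.95 | area=medium | maxDets=100",
    "AP @ IoU=0.50:0.95 | area=large  | maxDets=100",
    "AR @ IoU=0.50:0.95 | area=all    | maxDets=1",
    "AR @ IoU=0.50:0.95 | area=all    | maxDets=10",
    "AR @ IoU=0.50:0.95 | area=all    | maxDets=100",
    "AR @ IoU=0.50:0.95 | area=small  | maxDets=100",
    "AR @ IoU=0.50:0.95 | area=medium | maxDets=100",
    "AR @ IoU=0.50:0.95 | area=large  | maxDets=100",
]

# coco_evaluate() returns stats[0] and stats[1]:
print(COCO_STATS[0])  # reported as "Test Loss"
print(COCO_STATS[1])  # reported as "Test Accuracy"
```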

Function that runs the full training loop using the COCO utilities. It supports EarlyStopping and can also resume training from the latest checkpoint:

In [34]:
def fit(
    model: nn.Module,
    train_dataloader: DataLoader,
    test_dataloader: DataLoader,
    optimizer: optim.Optimizer,
    criterion: nn.Module,
    scheduler: optim.lr_scheduler._LRScheduler = None,
    early_stopping: EarlyStopper = None,
    epochs: int = EPOCHS,
    verbose: bool = True,
    device: str = DEVICE,
    start_from_checkpoint_data: Dict = None,
    base_dir: str = None,
    model_id: str = None,
    save_checkpoint: bool = False,
    save_best_model: bool = False,
    save_results: bool = False
) -> Tuple[List[float], List[float], List[float], List[float]]:
    # Initial checks
    if (save_best_model or save_checkpoint or save_results) and not (base_dir and model_id):
        raise ValueError("base_dir and model_id must be provided when save_best_model, save_checkpoint, or save_results is enabled.")
    
    if start_from_checkpoint_data:
        starting_epoch = start_from_checkpoint_data['epoch'] + 1
        if starting_epoch > epochs:
            raise IndexError("Bad checkpoint resume: starting_epoch > epochs")
        print(f"Training from checkpoint at epoch {starting_epoch}\n")
    else:
        starting_epoch = 1
        print("Training... \n")

    train_losses = [] if not start_from_checkpoint_data else start_from_checkpoint_data['train_losses']
    test_losses = [] if not start_from_checkpoint_data else start_from_checkpoint_data['test_losses']
    train_accs = [] if not start_from_checkpoint_data else start_from_checkpoint_data['train_accs']
    test_accs = [] if not start_from_checkpoint_data else start_from_checkpoint_data['test_accs']
    best_test_acc = -float('inf') if not start_from_checkpoint_data else start_from_checkpoint_data['best_test_acc']

    pbar = trange(starting_epoch, epochs + 1, desc="Epoch", unit="epoch", colour="green")
    for epoch in pbar:
        train_loss, train_acc = coco_train(
            model=model,
            data_loader=train_dataloader,
            optimizer=optimizer,
            device=device,
            epoch=epoch,
            criterion=None,
            verbose=verbose
        )
        test_loss, test_acc = coco_evaluate(model=model,
                                     data_loader=test_dataloader,
                                     device=device,
                                     verbose=verbose)

        if scheduler: scheduler.step()

        train_losses.append(train_loss)
        test_losses.append(test_loss)
        train_accs.append(train_acc)
        test_accs.append(test_acc)

        pbar.set_postfix({
            "Train Loss": format(train_loss, ".3f"),
            "Test Loss": format(test_loss, ".3f"),
            "Train Accuracy": format(train_acc, ".3f"),
            "Test Accuracy": format(test_acc, ".3f"),
        })

        # Checkpoint
        if save_checkpoint:
            file_name_path = os.path.join(base_dir, model_id + '_checkpoint.tar')
            os.makedirs(base_dir, exist_ok=True)
            torch.save(
                {
                    "model_state_dict": model.state_dict(),
                    "optimizer_state_dict": optimizer.state_dict(),
                    "scheduler_state_dict": scheduler.state_dict() if scheduler else None,
                    "data": {
                        "epoch": epoch,
                        "train_losses": train_losses,
                        "test_losses": test_losses,
                        "train_accs": train_accs,
                        "test_accs": test_accs,
                        "best_test_acc": best_test_acc
                    },
                },
                file_name_path,
            )

        # Save best model
        if save_best_model and test_acc > best_test_acc:
            best_test_acc = test_acc
            file_name_path = os.path.join(base_dir, model_id + '_best_model.pt')
            os.makedirs(base_dir, exist_ok=True)
            torch.save(model.state_dict(), file_name_path)
            if verbose:
                print(f"Best model saved at epoch {epoch}!")

        # Early stopping
        if early_stopping and early_stopping.early_stop(test_loss):
            print("\nEarly stopping activated.\n")
            break
    
    # Save results in JSON
    if save_results:
        results = {
            "train_losses": train_losses,
            "test_losses": test_losses,
            "train_accs": train_accs,
            "test_accs": test_accs,
            "best_test_acc": best_test_acc
        }
        results_file_path = os.path.join(base_dir, model_id + '_results.json')
        with open(results_file_path, 'w') as json_file:
            json.dump(results, json_file, indent=4)
        if verbose:
            print(f"Results saved to {results_file_path}!")

    print("\n \nFinished Training")
    return train_losses, test_losses, train_accs, test_accs
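
When resuming, `start_from_checkpoint_data` is expected to be the `data` dict saved inside the checkpoint above (i.e. what `torch.load(checkpoint_path)["data"]` would return). A sketch of that schema with dummy values; the key names are the ones `fit` actually reads:

```python
# Dummy values illustrating the schema fit() reads when resuming from a checkpoint.
checkpoint_data = {
    "epoch": 5,                                      # Last completed epoch; fit resumes at epoch + 1.
    "train_losses": [0.84, 0.50, 0.38, 0.37, 0.37],  # One entry per completed epoch.
    "test_losses": [0.52, 0.57, 0.58, 0.59, 0.59],
    "train_accs": [0.0] * 5,
    "test_accs": [0.94, 0.95, 0.96, 0.96, 0.95],
    "best_test_acc": 0.96,
}

required = {"epoch", "train_losses", "test_losses", "train_accs", "test_accs", "best_test_acc"}
assert required <= checkpoint_data.keys()
print(f"Resuming at epoch {checkpoint_data['epoch'] + 1}")
```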

Once everything is configured, we run the training:

In [35]:
EPOCHS = 20
BATCH_SIZE = 4

train_dataloader = DataLoader(train_dataset, batch_size=BATCH_SIZE, shuffle=True, collate_fn=lambda x: tuple(zip(*x)))
test_dataloader = DataLoader(test_dataset, batch_size=BATCH_SIZE, shuffle=False, collate_fn=lambda x: tuple(zip(*x)))

train_losses, test_losses, train_accs, test_accs = fit(
    model=model,
    train_dataloader=train_dataloader,
    test_dataloader=test_dataloader,
    optimizer=optimizer,
    criterion=None,
    scheduler=lr_scheduler,
    early_stopping=EarlyStopper(5, 0),
    epochs=EPOCHS,
    verbose=False,
    device=DEVICE,
    start_from_checkpoint_data=checkpoint_data,
    base_dir=os.path.join(
        RUTA_CARPETA_MODELS,
        FASTERRCNN_RESNET50_FPN_V2_MODEL_ID),
    model_id=FASTERRCNN_RESNET50_FPN_V2_MODEL_ID,
    save_checkpoint=LOAD_MODEL_FROM_CHECKPOINT,
    save_best_model = True,
    save_results = True
)
Training... 

Epoch:   0%|          | 0/20 [00:00<?, ?epoch/s]/kaggle/working/resources/utils/engine.py:30: FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.
  with torch.cuda.amp.autocast(enabled=scaler is not None):
Epoch: [1]  [ 0/70]  eta: 0:17:38  lr: 0.005000  loss: 3.2332 (3.2332)  loss_classifier: 0.6616 (0.6616)  loss_box_reg: 0.5221 (0.5221)  loss_objectness: 1.9527 (1.9527)  loss_rpn_box_reg: 0.0968 (0.0968)  time: 15.1189  data: 14.0877  max mem: 7857
Epoch: [1]  [10/70]  eta: 0:15:42  lr: 0.005000  loss: 1.4919 (1.7196)  loss_classifier: 0.4717 (0.4635)  loss_box_reg: 0.5568 (0.5668)  loss_objectness: 0.4557 (0.6076)  loss_rpn_box_reg: 0.0749 (0.0816)  time: 15.7142  data: 14.9562  max mem: 8029
Epoch: [1]  [20/70]  eta: 0:13:08  lr: 0.005000  loss: 1.2126 (1.3953)  loss_classifier: 0.3121 (0.3739)  loss_box_reg: 0.5044 (0.5462)  loss_objectness: 0.2501 (0.4009)  loss_rpn_box_reg: 0.0551 (0.0743)  time: 15.8020  data: 15.0573  max mem: 8029
Epoch: [1]  [30/70]  eta: 0:10:16  lr: 0.005000  loss: 0.8895 (1.1661)  loss_classifier: 0.2166 (0.3068)  loss_box_reg: 0.4733 (0.4921)  loss_objectness: 0.1004 (0.3014)  loss_rpn_box_reg: 0.0462 (0.0657)  time: 15.2296  data: 14.4679  max mem: 8029
Epoch: [1]  [40/70]  eta: 0:07:50  lr: 0.005000  loss: 0.6386 (1.0332)  loss_classifier: 0.1545 (0.2675)  loss_box_reg: 0.3661 (0.4596)  loss_objectness: 0.0543 (0.2459)  loss_rpn_box_reg: 0.0424 (0.0601)  time: 15.5631  data: 14.8146  max mem: 8029
Epoch: [1]  [50/70]  eta: 0:05:14  lr: 0.005000  loss: 0.5859 (0.9429)  loss_classifier: 0.1366 (0.2417)  loss_box_reg: 0.3462 (0.4324)  loss_objectness: 0.0698 (0.2116)  loss_rpn_box_reg: 0.0456 (0.0572)  time: 16.1752  data: 15.4312  max mem: 8029
Epoch: [1]  [60/70]  eta: 0:02:34  lr: 0.005000  loss: 0.5046 (0.8723)  loss_classifier: 0.1232 (0.2251)  loss_box_reg: 0.2963 (0.4094)  loss_objectness: 0.0452 (0.1844)  loss_rpn_box_reg: 0.0343 (0.0534)  time: 15.0471  data: 14.2814  max mem: 8029
Epoch: [1]  [69/70]  eta: 0:00:15  lr: 0.005000  loss: 0.5259 (0.8405)  loss_classifier: 0.1410 (0.2160)  loss_box_reg: 0.3144 (0.4018)  loss_objectness: 0.0351 (0.1689)  loss_rpn_box_reg: 0.0389 (0.0538)  time: 15.1424  data: 14.3768  max mem: 8029
Epoch: [1] Total time: 0:18:05 (15.5005 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3895 (0.3895)  evaluator_time: 0.9710 (0.9710)  time: 2.8201  data: 1.4578  max mem: 8029
Epoch:   0%|          | 0/20 [18:56<?, ?epoch/s, Train Loss=0.841, Test Loss=0.519, Train Accuracy=0.000, Test Accuracy=0.939]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3760 (0.3726)  evaluator_time: 0.2231 (0.2840)  time: 1.7438  data: 1.0852  max mem: 8029
Test: Total time: 0:00:31 (1.7439 s / it)
Averaged stats: model_time: 0.3760 (0.3726)  evaluator_time: 0.2231 (0.2840)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.519
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.939
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.525
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.603
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.018
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.174
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.602
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.057
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.604
Epoch:   5%|▌         | 1/20 [18:57<6:00:09, 1137.34s/epoch, Train Loss=0.841, Test Loss=0.519, Train Accuracy=0.000, Test Accuracy=0.939]
Epoch: [2]  [ 0/70]  eta: 0:14:26  lr: 0.005000  loss: 0.3349 (0.3349)  loss_classifier: 0.1098 (0.1098)  loss_box_reg: 0.1718 (0.1718)  loss_objectness: 0.0387 (0.0387)  loss_rpn_box_reg: 0.0146 (0.0146)  time: 12.3805  data: 11.5911  max mem: 8029
Epoch: [2]  [10/70]  eta: 0:16:21  lr: 0.005000  loss: 0.5340 (0.5607)  loss_classifier: 0.1341 (0.1472)  loss_box_reg: 0.3160 (0.3241)  loss_objectness: 0.0387 (0.0441)  loss_rpn_box_reg: 0.0451 (0.0454)  time: 16.3650  data: 15.6016  max mem: 8029
Epoch: [2]  [20/70]  eta: 0:12:46  lr: 0.005000  loss: 0.5230 (0.5194)  loss_classifier: 0.1341 (0.1370)  loss_box_reg: 0.3067 (0.2958)  loss_objectness: 0.0385 (0.0447)  loss_rpn_box_reg: 0.0392 (0.0419)  time: 15.4750  data: 14.7076  max mem: 8029
Epoch: [2]  [30/70]  eta: 0:10:21  lr: 0.005000  loss: 0.4946 (0.5198)  loss_classifier: 0.1371 (0.1345)  loss_box_reg: 0.2767 (0.3028)  loss_objectness: 0.0366 (0.0417)  loss_rpn_box_reg: 0.0342 (0.0408)  time: 15.0846  data: 14.3130  max mem: 8029
Epoch: [2]  [40/70]  eta: 0:07:43  lr: 0.005000  loss: 0.5534 (0.5172)  loss_classifier: 0.1282 (0.1303)  loss_box_reg: 0.3298 (0.3046)  loss_objectness: 0.0366 (0.0409)  loss_rpn_box_reg: 0.0384 (0.0414)  time: 15.5668  data: 14.8054  max mem: 8029
Epoch: [2]  [50/70]  eta: 0:05:07  lr: 0.005000  loss: 0.4957 (0.5097)  loss_classifier: 0.1157 (0.1278)  loss_box_reg: 0.3025 (0.3022)  loss_objectness: 0.0258 (0.0393)  loss_rpn_box_reg: 0.0371 (0.0405)  time: 15.1364  data: 14.3765  max mem: 8029
Epoch: [2]  [60/70]  eta: 0:02:34  lr: 0.005000  loss: 0.4740 (0.5043)  loss_classifier: 0.1135 (0.1276)  loss_box_reg: 0.2922 (0.2986)  loss_objectness: 0.0258 (0.0384)  loss_rpn_box_reg: 0.0333 (0.0397)  time: 15.4349  data: 14.6800  max mem: 8030
Epoch: [2]  [69/70]  eta: 0:00:15  lr: 0.005000  loss: 0.4287 (0.4996)  loss_classifier: 0.1134 (0.1266)  loss_box_reg: 0.2804 (0.2973)  loss_objectness: 0.0233 (0.0365)  loss_rpn_box_reg: 0.0319 (0.0393)  time: 14.8687  data: 14.1221  max mem: 8030
Epoch: [2] Total time: 0:17:51 (15.3026 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:52  model_time: 0.3858 (0.3858)  evaluator_time: 0.9602 (0.9602)  time: 2.8910  data: 1.5433  max mem: 8030
Epoch:   5%|▌         | 1/20 [37:41<6:00:09, 1137.34s/epoch, Train Loss=0.500, Test Loss=0.573, Train Accuracy=0.000, Test Accuracy=0.948]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3776 (0.3743)  evaluator_time: 0.2167 (0.2716)  time: 1.7857  data: 1.1378  max mem: 8030
Test: Total time: 0:00:32 (1.7858 s / it)
Averaged stats: model_time: 0.3776 (0.3743)  evaluator_time: 0.2167 (0.2716)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.573
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.948
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.643
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.001
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.644
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.190
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.641
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.057
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.643
Epoch:  10%|█         | 2/20 [37:42<5:39:01, 1130.11s/epoch, Train Loss=0.500, Test Loss=0.573, Train Accuracy=0.000, Test Accuracy=0.948]
Epoch: [3]  [ 0/70]  eta: 0:15:51  lr: 0.005000  loss: 0.4577 (0.4577)  loss_classifier: 0.1326 (0.1326)  loss_box_reg: 0.2820 (0.2820)  loss_objectness: 0.0178 (0.0178)  loss_rpn_box_reg: 0.0252 (0.0252)  time: 13.5945  data: 12.7850  max mem: 8030
Epoch: [3]  [10/70]  eta: 0:14:57  lr: 0.005000  loss: 0.4520 (0.4416)  loss_classifier: 0.1279 (0.1168)  loss_box_reg: 0.2820 (0.2662)  loss_objectness: 0.0209 (0.0254)  loss_rpn_box_reg: 0.0286 (0.0332)  time: 14.9520  data: 14.1719  max mem: 8030
Epoch: [3]  [20/70]  eta: 0:12:46  lr: 0.005000  loss: 0.4364 (0.4374)  loss_classifier: 0.1161 (0.1133)  loss_box_reg: 0.2754 (0.2663)  loss_objectness: 0.0209 (0.0237)  loss_rpn_box_reg: 0.0318 (0.0342)  time: 15.4133  data: 14.6347  max mem: 8030
Epoch: [3]  [30/70]  eta: 0:10:18  lr: 0.005000  loss: 0.4135 (0.4133)  loss_classifier: 0.1025 (0.1075)  loss_box_reg: 0.2627 (0.2530)  loss_objectness: 0.0188 (0.0209)  loss_rpn_box_reg: 0.0290 (0.0319)  time: 15.7276  data: 14.9473  max mem: 8030
Epoch: [3]  [40/70]  eta: 0:07:49  lr: 0.005000  loss: 0.3894 (0.4185)  loss_classifier: 0.1029 (0.1074)  loss_box_reg: 0.2414 (0.2583)  loss_objectness: 0.0160 (0.0212)  loss_rpn_box_reg: 0.0275 (0.0316)  time: 15.9733  data: 15.2049  max mem: 8030
Epoch: [3]  [50/70]  eta: 0:05:14  lr: 0.005000  loss: 0.4335 (0.4227)  loss_classifier: 0.1076 (0.1092)  loss_box_reg: 0.2634 (0.2593)  loss_objectness: 0.0185 (0.0222)  loss_rpn_box_reg: 0.0256 (0.0319)  time: 16.1980  data: 15.4426  max mem: 8030
Epoch: [3]  [60/70]  eta: 0:02:36  lr: 0.005000  loss: 0.4335 (0.4263)  loss_classifier: 0.1216 (0.1122)  loss_box_reg: 0.2724 (0.2607)  loss_objectness: 0.0166 (0.0218)  loss_rpn_box_reg: 0.0254 (0.0316)  time: 15.6932  data: 14.9200  max mem: 8030
Epoch: [4]  [ 0/70]  eta: 0:17:23  lr: 0.000500  loss: 0.2714 (0.2714)  loss_classifier: 0.0745 (0.0745)  loss_box_reg: 0.1744 (0.1744)  loss_objectness: 0.0083 (0.0083)  loss_rpn_box_reg: 0.0143 (0.0143)  time: 14.9037  data: 14.0976  max mem: 8030
Epoch: [4]  [10/70]  eta: 0:16:11  lr: 0.000500  loss: 0.3816 (0.3980)  loss_classifier: 0.0908 (0.0998)  loss_box_reg: 0.2546 (0.2431)  loss_objectness: 0.0169 (0.0239)  loss_rpn_box_reg: 0.0270 (0.0312)  time: 16.1879  data: 15.4277  max mem: 8030
Epoch: [4]  [20/70]  eta: 0:12:42  lr: 0.000500  loss: 0.3707 (0.3828)  loss_classifier: 0.0908 (0.0974)  loss_box_reg: 0.2391 (0.2354)  loss_objectness: 0.0169 (0.0218)  loss_rpn_box_reg: 0.0238 (0.0282)  time: 15.2707  data: 14.5134  max mem: 8030
Epoch: [4]  [30/70]  eta: 0:10:24  lr: 0.000500  loss: 0.3507 (0.3781)  loss_classifier: 0.0863 (0.0956)  loss_box_reg: 0.2309 (0.2349)  loss_objectness: 0.0154 (0.0200)  loss_rpn_box_reg: 0.0227 (0.0275)  time: 15.2926  data: 14.5223  max mem: 8030
Epoch: [4]  [40/70]  eta: 0:07:36  lr: 0.000500  loss: 0.3743 (0.3752)  loss_classifier: 0.0882 (0.0954)  loss_box_reg: 0.2319 (0.2343)  loss_objectness: 0.0137 (0.0193)  loss_rpn_box_reg: 0.0244 (0.0263)  time: 15.1888  data: 14.4141  max mem: 8030
Epoch: [4]  [50/70]  eta: 0:05:02  lr: 0.000500  loss: 0.3743 (0.3796)  loss_classifier: 0.0979 (0.0981)  loss_box_reg: 0.2323 (0.2360)  loss_objectness: 0.0175 (0.0188)  loss_rpn_box_reg: 0.0243 (0.0266)  time: 14.4223  data: 13.6610  max mem: 8030
Epoch: [4]  [60/70]  eta: 0:02:32  lr: 0.000500  loss: 0.3447 (0.3825)  loss_classifier: 0.0996 (0.0982)  loss_box_reg: 0.2239 (0.2375)  loss_objectness: 0.0157 (0.0189)  loss_rpn_box_reg: 0.0249 (0.0278)  time: 15.3562  data: 14.6008  max mem: 8030
Epoch: [4]  [69/70]  eta: 0:00:15  lr: 0.000500  loss: 0.3327 (0.3821)  loss_classifier: 0.0913 (0.0981)  loss_box_reg: 0.2239 (0.2368)  loss_objectness: 0.0147 (0.0188)  loss_rpn_box_reg: 0.0234 (0.0283)  time: 14.8779  data: 14.1158  max mem: 8030
Epoch: [4] Total time: 0:17:36 (15.0895 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:49  model_time: 0.3861 (0.3861)  evaluator_time: 0.9473 (0.9473)  time: 2.7638  data: 1.4287  max mem: 8030
Epoch:  15%|█▌        | 3/20 [1:15:09<5:21:39, 1135.27s/epoch, Train Loss=0.382, Test Loss=0.581, Train Accuracy=0.000, Test Accuracy=0.958]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3758 (0.3724)  evaluator_time: 0.2049 (0.2651)  time: 1.7110  data: 1.0716  max mem: 8030
Test: Total time: 0:00:30 (1.7111 s / it)
Averaged stats: model_time: 0.3758 (0.3724)  evaluator_time: 0.2049 (0.2651)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.581
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.958
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.657
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.002
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.650
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.020
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.191
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.649
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.129
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.650
Epoch:  20%|██        | 4/20 [1:15:10<4:59:44, 1124.02s/epoch, Train Loss=0.382, Test Loss=0.581, Train Accuracy=0.000, Test Accuracy=0.958]
Epoch: [5]  [ 0/70]  eta: 0:21:06  lr: 0.000500  loss: 0.3157 (0.3157)  loss_classifier: 0.0554 (0.0554)  loss_box_reg: 0.2132 (0.2132)  loss_objectness: 0.0195 (0.0195)  loss_rpn_box_reg: 0.0276 (0.0276)  time: 18.0962  data: 17.4030  max mem: 8030
Epoch: [5]  [10/70]  eta: 0:15:39  lr: 0.000500  loss: 0.3820 (0.3963)  loss_classifier: 0.0997 (0.1030)  loss_box_reg: 0.2300 (0.2440)  loss_objectness: 0.0156 (0.0168)  loss_rpn_box_reg: 0.0295 (0.0325)  time: 15.6614  data: 14.9033  max mem: 8030
Epoch: [5]  [20/70]  eta: 0:12:58  lr: 0.000500  loss: 0.3562 (0.3868)  loss_classifier: 0.0962 (0.0996)  loss_box_reg: 0.2300 (0.2395)  loss_objectness: 0.0113 (0.0154)  loss_rpn_box_reg: 0.0249 (0.0323)  time: 15.4366  data: 14.6783  max mem: 8030
Epoch: [5]  [30/70]  eta: 0:10:21  lr: 0.000500  loss: 0.3514 (0.3844)  loss_classifier: 0.0855 (0.0967)  loss_box_reg: 0.2314 (0.2411)  loss_objectness: 0.0139 (0.0158)  loss_rpn_box_reg: 0.0253 (0.0308)  time: 15.4527  data: 14.6870  max mem: 8030
Epoch: [5]  [40/70]  eta: 0:07:38  lr: 0.000500  loss: 0.3529 (0.3765)  loss_classifier: 0.0843 (0.0978)  loss_box_reg: 0.2206 (0.2341)  loss_objectness: 0.0144 (0.0156)  loss_rpn_box_reg: 0.0258 (0.0290)  time: 14.9956  data: 14.2181  max mem: 8030
Epoch: [5]  [50/70]  eta: 0:05:02  lr: 0.000500  loss: 0.3352 (0.3628)  loss_classifier: 0.0804 (0.0929)  loss_box_reg: 0.1941 (0.2278)  loss_objectness: 0.0100 (0.0152)  loss_rpn_box_reg: 0.0190 (0.0269)  time: 14.4480  data: 13.6721  max mem: 8030
Epoch: [5]  [60/70]  eta: 0:02:31  lr: 0.000500  loss: 0.3147 (0.3629)  loss_classifier: 0.0804 (0.0927)  loss_box_reg: 0.2112 (0.2278)  loss_objectness: 0.0086 (0.0153)  loss_rpn_box_reg: 0.0190 (0.0270)  time: 14.7789  data: 14.0017  max mem: 8030
Epoch: [5]  [69/70]  eta: 0:00:15  lr: 0.000500  loss: 0.3554 (0.3694)  loss_classifier: 0.0926 (0.0943)  loss_box_reg: 0.2361 (0.2309)  loss_objectness: 0.0132 (0.0165)  loss_rpn_box_reg: 0.0243 (0.0277)  time: 15.0526  data: 14.2949  max mem: 8030
Epoch: [5] Total time: 0:17:39 (15.1349 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:48  model_time: 0.3762 (0.3762)  evaluator_time: 0.9318 (0.9318)  time: 2.6958  data: 1.3859  max mem: 8030
Epoch:  20%|██        | 4/20 [1:33:39<4:59:44, 1124.02s/epoch, Train Loss=0.369, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.958]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3762 (0.3722)  evaluator_time: 0.1945 (0.2562)  time: 1.6719  data: 1.0416  max mem: 8030
Test: Total time: 0:00:30 (1.6720 s / it)
Averaged stats: model_time: 0.3762 (0.3722)  evaluator_time: 0.1945 (0.2562)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.586
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.958
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.661
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.653
Epoch:  25%|██▌       | 5/20 [1:33:40<4:39:42, 1118.85s/epoch, Train Loss=0.369, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.958]
Epoch: [6]  [ 0/70]  eta: 0:23:32  lr: 0.000500  loss: 0.3549 (0.3549)  loss_classifier: 0.0845 (0.0845)  loss_box_reg: 0.2291 (0.2291)  loss_objectness: 0.0199 (0.0199)  loss_rpn_box_reg: 0.0215 (0.0215)  time: 20.1744  data: 19.4753  max mem: 8030
Epoch: [6]  [10/70]  eta: 0:16:23  lr: 0.000500  loss: 0.3399 (0.3449)  loss_classifier: 0.0810 (0.0861)  loss_box_reg: 0.2192 (0.2220)  loss_objectness: 0.0106 (0.0131)  loss_rpn_box_reg: 0.0224 (0.0236)  time: 16.3859  data: 15.6274  max mem: 8030
Epoch: [6]  [20/70]  eta: 0:13:12  lr: 0.000500  loss: 0.3399 (0.3515)  loss_classifier: 0.0911 (0.0917)  loss_box_reg: 0.2156 (0.2206)  loss_objectness: 0.0110 (0.0140)  loss_rpn_box_reg: 0.0224 (0.0251)  time: 15.6400  data: 14.8629  max mem: 8030
Epoch: [6]  [30/70]  eta: 0:10:22  lr: 0.000500  loss: 0.3397 (0.3389)  loss_classifier: 0.0909 (0.0900)  loss_box_reg: 0.1985 (0.2120)  loss_objectness: 0.0128 (0.0132)  loss_rpn_box_reg: 0.0186 (0.0238)  time: 15.1102  data: 14.3225  max mem: 8030
Epoch: [6]  [40/70]  eta: 0:07:45  lr: 0.000500  loss: 0.3498 (0.3517)  loss_classifier: 0.0855 (0.0914)  loss_box_reg: 0.2202 (0.2220)  loss_objectness: 0.0102 (0.0134)  loss_rpn_box_reg: 0.0239 (0.0249)  time: 15.1516  data: 14.3647  max mem: 8030
Epoch: [6]  [50/70]  eta: 0:05:12  lr: 0.000500  loss: 0.3851 (0.3588)  loss_classifier: 0.0889 (0.0915)  loss_box_reg: 0.2404 (0.2275)  loss_objectness: 0.0122 (0.0140)  loss_rpn_box_reg: 0.0258 (0.0258)  time: 15.7605  data: 14.9888  max mem: 8030
Epoch: [6]  [60/70]  eta: 0:02:36  lr: 0.000500  loss: 0.3852 (0.3647)  loss_classifier: 0.0973 (0.0930)  loss_box_reg: 0.2402 (0.2297)  loss_objectness: 0.0145 (0.0146)  loss_rpn_box_reg: 0.0273 (0.0273)  time: 15.8815  data: 15.1141  max mem: 8030
Epoch: [6]  [69/70]  eta: 0:00:15  lr: 0.000500  loss: 0.3888 (0.3663)  loss_classifier: 0.0973 (0.0935)  loss_box_reg: 0.2462 (0.2310)  loss_objectness: 0.0145 (0.0145)  loss_rpn_box_reg: 0.0276 (0.0273)  time: 15.2576  data: 14.4892  max mem: 8030
Epoch: [6] Total time: 0:18:06 (15.5219 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:55  model_time: 0.3893 (0.3893)  evaluator_time: 0.9429 (0.9429)  time: 3.0642  data: 1.7303  max mem: 8030
Epoch:  25%|██▌       | 5/20 [1:52:38<4:39:42, 1118.85s/epoch, Train Loss=0.366, Test Loss=0.585, Train Accuracy=0.000, Test Accuracy=0.950]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3762 (0.3683)  evaluator_time: 0.1948 (0.2594)  time: 1.7508  data: 1.1212  max mem: 8030
Test: Total time: 0:00:31 (1.7509 s / it)
Averaged stats: model_time: 0.3762 (0.3683)  evaluator_time: 0.1948 (0.2594)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.585
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.950
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.663
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.001
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.020
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.191
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.649
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.086
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
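For context on the table above: the AP@[0.50:0.95] row is the mean of the AP computed at ten IoU thresholds (the standard COCO protocol), which is why it sits well below the lenient AP@0.50 value. A minimal sketch of that threshold grid:

```python
# COCO-style AP@[0.50:0.95]: AP is averaged over ten IoU thresholds
# 0.50, 0.55, ..., 0.95, so a single loose match no longer suffices.
thresholds = [round(0.50 + 0.05 * i, 2) for i in range(10)]
print(thresholds)  # ten thresholds from 0.50 to 0.95
```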
Epoch:  25%|██▌       | 5/20 [1:52:39<5:37:57, 1351.84s/epoch, Train Loss=0.366, Test Loss=0.585, Train Accuracy=0.000, Test Accuracy=0.950]
Early stopping activated.


 
Finished Training
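The run above was halted by the notebook's `EarlyStopper(10, 0)`. Its actual implementation lives in the project's utils; as a hedged sketch, assuming the conventional `(patience, min_delta)` semantics applied to the test loss, the logic is roughly:

```python
class EarlyStopper:
    """Sketch of an early-stopping helper with (patience, min_delta)
    semantics; the notebook's real class may differ in details."""
    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.counter = 0
        self.best_loss = float("inf")

    def early_stop(self, validation_loss):
        if validation_loss < self.best_loss - self.min_delta:
            self.best_loss = validation_loss
            self.counter = 0          # improvement: reset the patience counter
        else:
            self.counter += 1         # no improvement this epoch
            if self.counter >= self.patience:
                return True           # patience exhausted: stop training
        return False
```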

Training without data augmentation¶

At this point, we run an experiment without applying any data augmentation:
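One detail worth flagging before the cell: torchvision detection models take lists of images and per-image target dicts of varying sizes, so the DataLoader uses `collate_fn=lambda x: tuple(zip(*x))` instead of the default tensor stacking. A minimal illustration with placeholder samples:

```python
# Each dataset item is an (image, target) pair; targets hold a variable
# number of boxes, so a batch cannot be stacked into a single tensor.
batch = [("img0", {"boxes": [[0, 0, 10, 10]]}),
         ("img1", {"boxes": [[5, 5, 20, 20], [1, 1, 4, 4]]})]
# zip(*batch) transposes the list of pairs into (images, targets) tuples.
images, targets = tuple(zip(*batch))
print(len(images), len(targets))  # 2 2
```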

In [39]:
def get_transforms(train):
    transforms = []
    if train:
        pass  # No augmentation transforms are added in this experiment.
    transforms.append(T.Resize(size=IMG_SIZE))
    transforms.append(T.ToDtype(torch.float, scale=True))
    transforms.append(T.ToPureTensor())
    return T.Compose(transforms)

# Apply the transforms.
train_dataset = TransformedPalmTreeDataset(train_dataset_raw, transforms=get_transforms(train=True))
test_dataset = TransformedPalmTreeDataset(test_dataset_raw, transforms=get_transforms(train=False))

save_checkpoint_path = os.path.join(RUTA_CARPETA_MODELS, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG + '_checkpoint.tar')
if LOAD_MODEL_FROM_CHECKPOINT and os.path.exists(save_checkpoint_path):
    model, optimizer, lr_scheduler, checkpoint_data = get_fasterrcnn_resnet50_fpn_v2_model(len(LABELS.keys()), save_checkpoint_path, DEVICE)
else:
    model, optimizer, lr_scheduler, checkpoint_data = get_fasterrcnn_resnet50_fpn_v2_model(len(LABELS.keys()), None, DEVICE)
    
EPOCHS = 20
BATCH_SIZE = 4

train_dataloader = DataLoader(train_dataset, batch_size=BATCH_SIZE, shuffle=True, collate_fn=lambda x: tuple(zip(*x)))
test_dataloader = DataLoader(test_dataset, batch_size=BATCH_SIZE, shuffle=False, collate_fn=lambda x: tuple(zip(*x)))

train_losses, test_losses, train_accs, test_accs = fit(
    model=model,
    train_dataloader=train_dataloader,
    test_dataloader=test_dataloader,
    optimizer=optimizer,
    criterion=None,
    scheduler=lr_scheduler,
    early_stopping=EarlyStopper(10, 0),
    epochs=EPOCHS,
    verbose=False,
    device=DEVICE,
    start_from_checkpoint_data=checkpoint_data,
    base_dir=os.path.join(
        RUTA_CARPETA_MODELS,
        FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG),
    model_id=FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG,
    save_checkpoint=LOAD_MODEL_FROM_CHECKPOINT,
    save_best_model=True,
    save_results=True
)
Loading the model from the checkpoint: ./resources/models/fasterrcnn_resnet50_fpn_v2_sin_aug/fasterrcnn_resnet50_fpn_v2_sin_aug_checkpoint.tar

Model loaded from the checkpoint at epoch 7.

Training from checkpoint at epoch 8

Epoch:   0%|          | 0/13 [00:00<?, ?epoch/s]
Epoch: [8]  [ 0/70]  eta: 0:01:40  lr: 0.000050  loss: 0.2994 (0.2994)  loss_classifier: 0.0674 (0.0674)  loss_box_reg: 0.2092 (0.2092)  loss_objectness: 0.0031 (0.0031)  loss_rpn_box_reg: 0.0197 (0.0197)  time: 1.4408  data: 0.6269  max mem: 8870
Epoch: [8]  [10/70]  eta: 0:01:31  lr: 0.000050  loss: 0.3304 (0.3396)  loss_classifier: 0.0790 (0.0794)  loss_box_reg: 0.2266 (0.2224)  loss_objectness: 0.0100 (0.0127)  loss_rpn_box_reg: 0.0224 (0.0251)  time: 1.5230  data: 0.7672  max mem: 8870
Epoch: [8]  [20/70]  eta: 0:01:15  lr: 0.000050  loss: 0.3369 (0.3423)  loss_classifier: 0.0872 (0.0845)  loss_box_reg: 0.2266 (0.2206)  loss_objectness: 0.0079 (0.0113)  loss_rpn_box_reg: 0.0231 (0.0259)  time: 1.5228  data: 0.7672  max mem: 8870
Epoch: [8]  [30/70]  eta: 0:01:00  lr: 0.000050  loss: 0.3567 (0.3502)  loss_classifier: 0.0937 (0.0891)  loss_box_reg: 0.2295 (0.2231)  loss_objectness: 0.0084 (0.0116)  loss_rpn_box_reg: 0.0262 (0.0264)  time: 1.4925  data: 0.7328  max mem: 8870
Epoch: [8]  [40/70]  eta: 0:00:45  lr: 0.000050  loss: 0.3480 (0.3534)  loss_classifier: 0.0981 (0.0910)  loss_box_reg: 0.2202 (0.2235)  loss_objectness: 0.0120 (0.0120)  loss_rpn_box_reg: 0.0243 (0.0268)  time: 1.5000  data: 0.7447  max mem: 8870
Epoch: [8]  [50/70]  eta: 0:00:30  lr: 0.000050  loss: 0.3467 (0.3509)  loss_classifier: 0.0923 (0.0909)  loss_box_reg: 0.2202 (0.2222)  loss_objectness: 0.0107 (0.0117)  loss_rpn_box_reg: 0.0215 (0.0260)  time: 1.5053  data: 0.7600  max mem: 8870
Epoch: [8]  [60/70]  eta: 0:00:15  lr: 0.000050  loss: 0.3504 (0.3455)  loss_classifier: 0.0918 (0.0887)  loss_box_reg: 0.2310 (0.2200)  loss_objectness: 0.0092 (0.0112)  loss_rpn_box_reg: 0.0222 (0.0257)  time: 1.5023  data: 0.7403  max mem: 8870
Epoch: [8]  [69/70]  eta: 0:00:01  lr: 0.000050  loss: 0.3598 (0.3468)  loss_classifier: 0.0918 (0.0898)  loss_box_reg: 0.2155 (0.2201)  loss_objectness: 0.0090 (0.0115)  loss_rpn_box_reg: 0.0220 (0.0254)  time: 1.4737  data: 0.7056  max mem: 8870
Epoch: [8] Total time: 0:01:44 (1.4975 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3934 (0.3934)  evaluator_time: 0.9314 (0.9314)  time: 2.7986  data: 1.4720  max mem: 8870
Epoch:   0%|          | 0/13 [02:35<?, ?epoch/s, Train Loss=0.347, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3763 (0.3740)  evaluator_time: 0.1961 (0.2544)  time: 1.6986  data: 1.0682  max mem: 8870
Test: Total time: 0:00:30 (1.6987 s / it)
Averaged stats: model_time: 0.3763 (0.3740)  evaluator_time: 0.1961 (0.2544)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.588
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.958
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.671
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.193
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.653
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.157
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.655
Epoch:   8%|▊         | 1/13 [02:36<31:15, 156.26s/epoch, Train Loss=0.347, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Epoch: [9]  [ 0/70]  eta: 0:01:38  lr: 0.000050  loss: 0.2587 (0.2587)  loss_classifier: 0.0728 (0.0728)  loss_box_reg: 0.1634 (0.1634)  loss_objectness: 0.0135 (0.0135)  loss_rpn_box_reg: 0.0090 (0.0090)  time: 1.4105  data: 0.6225  max mem: 8870
Epoch: [9]  [10/70]  eta: 0:01:31  lr: 0.000050  loss: 0.3870 (0.3617)  loss_classifier: 0.1027 (0.0939)  loss_box_reg: 0.2426 (0.2265)  loss_objectness: 0.0135 (0.0133)  loss_rpn_box_reg: 0.0284 (0.0279)  time: 1.5197  data: 0.7438  max mem: 8870
Epoch: [9]  [20/70]  eta: 0:01:14  lr: 0.000050  loss: 0.3322 (0.3345)  loss_classifier: 0.0751 (0.0868)  loss_box_reg: 0.2172 (0.2120)  loss_objectness: 0.0106 (0.0116)  loss_rpn_box_reg: 0.0234 (0.0240)  time: 1.4992  data: 0.7371  max mem: 8870
Epoch: [9]  [30/70]  eta: 0:00:59  lr: 0.000050  loss: 0.3096 (0.3379)  loss_classifier: 0.0760 (0.0886)  loss_box_reg: 0.1983 (0.2131)  loss_objectness: 0.0081 (0.0121)  loss_rpn_box_reg: 0.0198 (0.0242)  time: 1.4855  data: 0.7227  max mem: 8870
Epoch: [9]  [40/70]  eta: 0:00:44  lr: 0.000050  loss: 0.3363 (0.3419)  loss_classifier: 0.0885 (0.0890)  loss_box_reg: 0.2160 (0.2160)  loss_objectness: 0.0095 (0.0120)  loss_rpn_box_reg: 0.0241 (0.0248)  time: 1.4930  data: 0.7293  max mem: 8870
Epoch: [9]  [50/70]  eta: 0:00:29  lr: 0.000050  loss: 0.3523 (0.3426)  loss_classifier: 0.0903 (0.0896)  loss_box_reg: 0.2160 (0.2144)  loss_objectness: 0.0113 (0.0127)  loss_rpn_box_reg: 0.0280 (0.0259)  time: 1.4838  data: 0.7327  max mem: 8870
Epoch: [9]  [60/70]  eta: 0:00:14  lr: 0.000050  loss: 0.3523 (0.3494)  loss_classifier: 0.0935 (0.0913)  loss_box_reg: 0.2306 (0.2194)  loss_objectness: 0.0113 (0.0127)  loss_rpn_box_reg: 0.0256 (0.0260)  time: 1.4831  data: 0.7322  max mem: 8870
Epoch: [9]  [69/70]  eta: 0:00:01  lr: 0.000050  loss: 0.3244 (0.3447)  loss_classifier: 0.0804 (0.0901)  loss_box_reg: 0.2086 (0.2165)  loss_objectness: 0.0095 (0.0124)  loss_rpn_box_reg: 0.0227 (0.0256)  time: 1.4836  data: 0.7308  max mem: 8870
Epoch: [9] Total time: 0:01:44 (1.4926 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:51  model_time: 0.3951 (0.3951)  evaluator_time: 0.9898 (0.9898)  time: 2.8577  data: 1.4710  max mem: 8870
Epoch:   8%|▊         | 1/13 [05:10<31:15, 156.26s/epoch, Train Loss=0.345, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3791 (0.3750)  evaluator_time: 0.2029 (0.2608)  time: 1.6965  data: 1.0588  max mem: 8870
Test: Total time: 0:00:30 (1.6966 s / it)
Averaged stats: model_time: 0.3791 (0.3750)  evaluator_time: 0.2029 (0.2608)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.588
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.958
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.670
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.193
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.653
Epoch:  15%|█▌        | 2/13 [05:11<28:32, 155.65s/epoch, Train Loss=0.345, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Epoch: [10]  [ 0/70]  eta: 0:01:47  lr: 0.000005  loss: 0.2679 (0.2679)  loss_classifier: 0.0770 (0.0770)  loss_box_reg: 0.1590 (0.1590)  loss_objectness: 0.0076 (0.0076)  loss_rpn_box_reg: 0.0243 (0.0243)  time: 1.5426  data: 0.7452  max mem: 8870
Epoch: [10]  [10/70]  eta: 0:01:28  lr: 0.000005  loss: 0.3374 (0.3447)  loss_classifier: 0.0794 (0.0919)  loss_box_reg: 0.2322 (0.2168)  loss_objectness: 0.0091 (0.0100)  loss_rpn_box_reg: 0.0263 (0.0261)  time: 1.4817  data: 0.7246  max mem: 8870
Epoch: [10]  [20/70]  eta: 0:01:12  lr: 0.000005  loss: 0.3148 (0.3370)  loss_classifier: 0.0849 (0.0909)  loss_box_reg: 0.2067 (0.2106)  loss_objectness: 0.0081 (0.0104)  loss_rpn_box_reg: 0.0226 (0.0250)  time: 1.4476  data: 0.6843  max mem: 8870
Epoch: [10]  [30/70]  eta: 0:00:58  lr: 0.000005  loss: 0.3496 (0.3491)  loss_classifier: 0.0849 (0.0938)  loss_box_reg: 0.2214 (0.2173)  loss_objectness: 0.0103 (0.0115)  loss_rpn_box_reg: 0.0264 (0.0266)  time: 1.4506  data: 0.7005  max mem: 8870
Epoch: [10]  [40/70]  eta: 0:00:43  lr: 0.000005  loss: 0.3679 (0.3557)  loss_classifier: 0.1007 (0.0945)  loss_box_reg: 0.2374 (0.2230)  loss_objectness: 0.0116 (0.0111)  loss_rpn_box_reg: 0.0294 (0.0271)  time: 1.4573  data: 0.7062  max mem: 8870
Epoch: [10]  [50/70]  eta: 0:00:29  lr: 0.000005  loss: 0.3650 (0.3495)  loss_classifier: 0.0994 (0.0930)  loss_box_reg: 0.2332 (0.2199)  loss_objectness: 0.0088 (0.0108)  loss_rpn_box_reg: 0.0219 (0.0259)  time: 1.4505  data: 0.6813  max mem: 8870
Epoch: [10]  [60/70]  eta: 0:00:14  lr: 0.000005  loss: 0.3208 (0.3476)  loss_classifier: 0.0767 (0.0914)  loss_box_reg: 0.2044 (0.2191)  loss_objectness: 0.0080 (0.0111)  loss_rpn_box_reg: 0.0198 (0.0260)  time: 1.4648  data: 0.7071  max mem: 8870
Epoch: [10]  [69/70]  eta: 0:00:01  lr: 0.000005  loss: 0.3208 (0.3413)  loss_classifier: 0.0767 (0.0891)  loss_box_reg: 0.2044 (0.2158)  loss_objectness: 0.0082 (0.0111)  loss_rpn_box_reg: 0.0256 (0.0253)  time: 1.4687  data: 0.7093  max mem: 8870
Epoch: [10] Total time: 0:01:42 (1.4612 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:48  model_time: 0.3918 (0.3918)  evaluator_time: 0.9290 (0.9290)  time: 2.6700  data: 1.3474  max mem: 8870
Epoch:  15%|█▌        | 2/13 [07:41<28:32, 155.65s/epoch, Train Loss=0.341, Test Loss=0.587, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3781 (0.3743)  evaluator_time: 0.1972 (0.2585)  time: 1.6473  data: 1.0126  max mem: 8870
Test: Total time: 0:00:29 (1.6474 s / it)
Averaged stats: model_time: 0.3781 (0.3743)  evaluator_time: 0.1972 (0.2585)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.587
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.669
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.171
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.654
Epoch:  23%|██▎       | 3/13 [07:42<25:34, 153.48s/epoch, Train Loss=0.341, Test Loss=0.587, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [11]  [ 0/70]  eta: 0:01:55  lr: 0.000005  loss: 0.5099 (0.5099)  loss_classifier: 0.1481 (0.1481)  loss_box_reg: 0.3039 (0.3039)  loss_objectness: 0.0116 (0.0116)  loss_rpn_box_reg: 0.0463 (0.0463)  time: 1.6438  data: 0.8500  max mem: 8870
Epoch: [11]  [10/70]  eta: 0:01:33  lr: 0.000005  loss: 0.3831 (0.3723)  loss_classifier: 0.0955 (0.0951)  loss_box_reg: 0.2453 (0.2389)  loss_objectness: 0.0112 (0.0114)  loss_rpn_box_reg: 0.0289 (0.0268)  time: 1.5542  data: 0.8311  max mem: 8870
Epoch: [11]  [20/70]  eta: 0:01:16  lr: 0.000005  loss: 0.3591 (0.3667)  loss_classifier: 0.0907 (0.0951)  loss_box_reg: 0.2209 (0.2316)  loss_objectness: 0.0111 (0.0132)  loss_rpn_box_reg: 0.0241 (0.0267)  time: 1.5265  data: 0.7866  max mem: 8870
Epoch: [11]  [30/70]  eta: 0:01:00  lr: 0.000005  loss: 0.3244 (0.3519)  loss_classifier: 0.0869 (0.0924)  loss_box_reg: 0.2159 (0.2224)  loss_objectness: 0.0094 (0.0121)  loss_rpn_box_reg: 0.0211 (0.0249)  time: 1.5052  data: 0.7248  max mem: 8870
Epoch: [11]  [40/70]  eta: 0:00:45  lr: 0.000005  loss: 0.2995 (0.3401)  loss_classifier: 0.0823 (0.0868)  loss_box_reg: 0.2065 (0.2169)  loss_objectness: 0.0095 (0.0120)  loss_rpn_box_reg: 0.0203 (0.0244)  time: 1.5249  data: 0.7380  max mem: 8871
Epoch: [11]  [50/70]  eta: 0:00:30  lr: 0.000005  loss: 0.2951 (0.3386)  loss_classifier: 0.0792 (0.0869)  loss_box_reg: 0.2013 (0.2158)  loss_objectness: 0.0103 (0.0119)  loss_rpn_box_reg: 0.0208 (0.0240)  time: 1.5220  data: 0.7574  max mem: 8871
Epoch: [11]  [60/70]  eta: 0:00:15  lr: 0.000005  loss: 0.3643 (0.3480)  loss_classifier: 0.0992 (0.0900)  loss_box_reg: 0.2360 (0.2206)  loss_objectness: 0.0101 (0.0122)  loss_rpn_box_reg: 0.0236 (0.0251)  time: 1.5235  data: 0.7702  max mem: 8871
Epoch: [11]  [69/70]  eta: 0:00:01  lr: 0.000005  loss: 0.3488 (0.3476)  loss_classifier: 0.0895 (0.0900)  loss_box_reg: 0.2105 (0.2196)  loss_objectness: 0.0102 (0.0123)  loss_rpn_box_reg: 0.0251 (0.0258)  time: 1.4963  data: 0.7467  max mem: 8871
Epoch: [11] Total time: 0:01:46 (1.5171 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3932 (0.3932)  evaluator_time: 0.9240 (0.9240)  time: 2.8081  data: 1.4891  max mem: 8871
Epoch:  23%|██▎       | 3/13 [10:19<25:34, 153.48s/epoch, Train Loss=0.348, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3786 (0.3752)  evaluator_time: 0.1958 (0.2584)  time: 1.7207  data: 1.0852  max mem: 8871
Test: Total time: 0:00:30 (1.7208 s / it)
Averaged stats: model_time: 0.3786 (0.3752)  evaluator_time: 0.1958 (0.2584)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.588
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.670
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.193
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.129
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.654
Epoch:  31%|███       | 4/13 [10:20<23:16, 155.20s/epoch, Train Loss=0.348, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [12]  [ 0/70]  eta: 0:01:33  lr: 0.000005  loss: 0.2908 (0.2908)  loss_classifier: 0.0856 (0.0856)  loss_box_reg: 0.1752 (0.1752)  loss_objectness: 0.0165 (0.0165)  loss_rpn_box_reg: 0.0135 (0.0135)  time: 1.3298  data: 0.5406  max mem: 8871
Epoch: [12]  [10/70]  eta: 0:01:30  lr: 0.000005  loss: 0.3086 (0.3305)  loss_classifier: 0.0857 (0.0810)  loss_box_reg: 0.1999 (0.2132)  loss_objectness: 0.0097 (0.0117)  loss_rpn_box_reg: 0.0200 (0.0247)  time: 1.5043  data: 0.7602  max mem: 8871
Epoch: [12]  [20/70]  eta: 0:01:15  lr: 0.000005  loss: 0.3441 (0.3470)  loss_classifier: 0.0876 (0.0898)  loss_box_reg: 0.2200 (0.2197)  loss_objectness: 0.0097 (0.0120)  loss_rpn_box_reg: 0.0266 (0.0255)  time: 1.5149  data: 0.7456  max mem: 8871
Epoch: [12]  [30/70]  eta: 0:01:00  lr: 0.000005  loss: 0.3607 (0.3539)  loss_classifier: 0.0861 (0.0897)  loss_box_reg: 0.2324 (0.2248)  loss_objectness: 0.0105 (0.0121)  loss_rpn_box_reg: 0.0277 (0.0274)  time: 1.5204  data: 0.7446  max mem: 8871
Epoch: [12]  [40/70]  eta: 0:00:45  lr: 0.000005  loss: 0.3573 (0.3499)  loss_classifier: 0.0861 (0.0916)  loss_box_reg: 0.2190 (0.2207)  loss_objectness: 0.0105 (0.0119)  loss_rpn_box_reg: 0.0265 (0.0257)  time: 1.5141  data: 0.7506  max mem: 8871
Epoch: [12]  [50/70]  eta: 0:00:30  lr: 0.000005  loss: 0.3405 (0.3456)  loss_classifier: 0.0960 (0.0897)  loss_box_reg: 0.2052 (0.2181)  loss_objectness: 0.0094 (0.0118)  loss_rpn_box_reg: 0.0226 (0.0259)  time: 1.4986  data: 0.7298  max mem: 8871
Epoch: [12]  [60/70]  eta: 0:00:15  lr: 0.000005  loss: 0.3363 (0.3459)  loss_classifier: 0.0912 (0.0899)  loss_box_reg: 0.2157 (0.2183)  loss_objectness: 0.0089 (0.0119)  loss_rpn_box_reg: 0.0227 (0.0258)  time: 1.5157  data: 0.7517  max mem: 8871
Epoch: [12]  [69/70]  eta: 0:00:01  lr: 0.000005  loss: 0.3426 (0.3454)  loss_classifier: 0.0923 (0.0893)  loss_box_reg: 0.2206 (0.2188)  loss_objectness: 0.0102 (0.0117)  loss_rpn_box_reg: 0.0247 (0.0256)  time: 1.5309  data: 0.7792  max mem: 8871
Epoch: [12] Total time: 0:01:46 (1.5148 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3897 (0.3897)  evaluator_time: 0.9351 (0.9351)  time: 2.8219  data: 1.4954  max mem: 8871
Epoch:  31%|███       | 4/13 [12:57<23:16, 155.20s/epoch, Train Loss=0.345, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3773 (0.3742)  evaluator_time: 0.1941 (0.2575)  time: 1.7135  data: 1.0799  max mem: 8871
Test: Total time: 0:00:30 (1.7135 s / it)
Averaged stats: model_time: 0.3773 (0.3742)  evaluator_time: 0.1941 (0.2575)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.586
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.668
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.650
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.193
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.653
Epoch:  38%|███▊      | 5/13 [12:57<20:48, 156.06s/epoch, Train Loss=0.345, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [13]  [ 0/70]  eta: 0:01:45  lr: 0.000001  loss: 0.3433 (0.3433)  loss_classifier: 0.0882 (0.0882)  loss_box_reg: 0.2191 (0.2191)  loss_objectness: 0.0105 (0.0105)  loss_rpn_box_reg: 0.0255 (0.0255)  time: 1.5110  data: 0.7203  max mem: 8871
Epoch: [13]  [10/70]  eta: 0:01:29  lr: 0.000001  loss: 0.3092 (0.3377)  loss_classifier: 0.0854 (0.0903)  loss_box_reg: 0.1964 (0.2100)  loss_objectness: 0.0105 (0.0118)  loss_rpn_box_reg: 0.0212 (0.0257)  time: 1.4936  data: 0.7303  max mem: 8871
Epoch: [13]  [20/70]  eta: 0:01:14  lr: 0.000001  loss: 0.3092 (0.3486)  loss_classifier: 0.0781 (0.0898)  loss_box_reg: 0.1968 (0.2198)  loss_objectness: 0.0100 (0.0121)  loss_rpn_box_reg: 0.0212 (0.0268)  time: 1.4977  data: 0.7473  max mem: 8871
Epoch: [13]  [30/70]  eta: 0:01:00  lr: 0.000001  loss: 0.3263 (0.3402)  loss_classifier: 0.0768 (0.0863)  loss_box_reg: 0.2145 (0.2156)  loss_objectness: 0.0100 (0.0119)  loss_rpn_box_reg: 0.0244 (0.0263)  time: 1.5169  data: 0.7694  max mem: 8871
Epoch: [13]  [40/70]  eta: 0:00:45  lr: 0.000001  loss: 0.3263 (0.3433)  loss_classifier: 0.0789 (0.0880)  loss_box_reg: 0.2145 (0.2176)  loss_objectness: 0.0106 (0.0123)  loss_rpn_box_reg: 0.0244 (0.0254)  time: 1.5229  data: 0.7667  max mem: 8871
Epoch: [13]  [50/70]  eta: 0:00:30  lr: 0.000001  loss: 0.3468 (0.3415)  loss_classifier: 0.0921 (0.0885)  loss_box_reg: 0.2125 (0.2167)  loss_objectness: 0.0108 (0.0118)  loss_rpn_box_reg: 0.0214 (0.0245)  time: 1.4997  data: 0.7330  max mem: 8871
Epoch: [13]  [60/70]  eta: 0:00:15  lr: 0.000001  loss: 0.3179 (0.3422)  loss_classifier: 0.0832 (0.0884)  loss_box_reg: 0.2015 (0.2161)  loss_objectness: 0.0114 (0.0127)  loss_rpn_box_reg: 0.0214 (0.0251)  time: 1.5108  data: 0.7417  max mem: 8871
Epoch: [13]  [69/70]  eta: 0:00:01  lr: 0.000001  loss: 0.3561 (0.3453)  loss_classifier: 0.0874 (0.0890)  loss_box_reg: 0.2259 (0.2185)  loss_objectness: 0.0115 (0.0125)  loss_rpn_box_reg: 0.0242 (0.0254)  time: 1.5173  data: 0.7740  max mem: 8871
Epoch: [13] Total time: 0:01:45 (1.5077 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3890 (0.3890)  evaluator_time: 0.9192 (0.9192)  time: 2.7884  data: 1.4784  max mem: 8871
Epoch:  38%|███▊      | 5/13 [15:34<20:48, 156.06s/epoch, Train Loss=0.345, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.956]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3779 (0.3751)  evaluator_time: 0.1914 (0.2565)  time: 1.7245  data: 1.0910  max mem: 8871
Test: Total time: 0:00:31 (1.7246 s / it)
Averaged stats: model_time: 0.3779 (0.3751)  evaluator_time: 0.1914 (0.2565)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.586
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.956
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.666
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.020
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.653
Epoch:  46%|████▌     | 6/13 [15:34<18:14, 156.43s/epoch, Train Loss=0.345, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.956]
Epoch: [14]  [ 0/70]  eta: 0:01:31  lr: 0.000001  loss: 0.3619 (0.3619)  loss_classifier: 0.1108 (0.1108)  loss_box_reg: 0.2242 (0.2242)  loss_objectness: 0.0100 (0.0100)  loss_rpn_box_reg: 0.0169 (0.0169)  time: 1.3065  data: 0.5145  max mem: 8871
Epoch: [14]  [10/70]  eta: 0:01:30  lr: 0.000001  loss: 0.3114 (0.3298)  loss_classifier: 0.0819 (0.0840)  loss_box_reg: 0.2012 (0.2153)  loss_objectness: 0.0089 (0.0113)  loss_rpn_box_reg: 0.0171 (0.0191)  time: 1.5075  data: 0.7318  max mem: 8871
Epoch: [14]  [20/70]  eta: 0:01:16  lr: 0.000001  loss: 0.3254 (0.3336)  loss_classifier: 0.0820 (0.0868)  loss_box_reg: 0.2156 (0.2143)  loss_objectness: 0.0090 (0.0113)  loss_rpn_box_reg: 0.0194 (0.0212)  time: 1.5399  data: 0.7712  max mem: 8871
Epoch: [14]  [30/70]  eta: 0:01:00  lr: 0.000001  loss: 0.3351 (0.3382)  loss_classifier: 0.0835 (0.0871)  loss_box_reg: 0.2187 (0.2162)  loss_objectness: 0.0095 (0.0110)  loss_rpn_box_reg: 0.0231 (0.0240)  time: 1.5245  data: 0.7548  max mem: 8871
Epoch: [14]  [40/70]  eta: 0:00:45  lr: 0.000001  loss: 0.3327 (0.3414)  loss_classifier: 0.0871 (0.0874)  loss_box_reg: 0.2159 (0.2176)  loss_objectness: 0.0095 (0.0114)  loss_rpn_box_reg: 0.0249 (0.0250)  time: 1.5211  data: 0.7564  max mem: 8871
Epoch: [14]  [50/70]  eta: 0:00:30  lr: 0.000001  loss: 0.3327 (0.3394)  loss_classifier: 0.0900 (0.0881)  loss_box_reg: 0.2159 (0.2146)  loss_objectness: 0.0095 (0.0113)  loss_rpn_box_reg: 0.0249 (0.0253)  time: 1.5364  data: 0.7663  max mem: 8871
Epoch: [14]  [60/70]  eta: 0:00:15  lr: 0.000001  loss: 0.3378 (0.3392)  loss_classifier: 0.0879 (0.0880)  loss_box_reg: 0.2215 (0.2147)  loss_objectness: 0.0081 (0.0111)  loss_rpn_box_reg: 0.0234 (0.0254)  time: 1.5254  data: 0.7506  max mem: 8871
Epoch: [14]  [69/70]  eta: 0:00:01  lr: 0.000001  loss: 0.3491 (0.3435)  loss_classifier: 0.0827 (0.0884)  loss_box_reg: 0.2303 (0.2180)  loss_objectness: 0.0091 (0.0115)  loss_rpn_box_reg: 0.0258 (0.0257)  time: 1.5317  data: 0.7875  max mem: 8871
Epoch: [14] Total time: 0:01:46 (1.5256 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3874 (0.3874)  evaluator_time: 0.9244 (0.9244)  time: 2.7978  data: 1.4843  max mem: 8871
Epoch:  46%|████▌     | 6/13 [18:13<18:14, 156.43s/epoch, Train Loss=0.343, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3783 (0.3749)  evaluator_time: 0.1930 (0.2536)  time: 1.7285  data: 1.0981  max mem: 8871
Test: Total time: 0:00:31 (1.7286 s / it)
Averaged stats: model_time: 0.3783 (0.3749)  evaluator_time: 0.1930 (0.2536)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.586
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.668
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.653
Epoch:  54%|█████▍    | 7/13 [18:13<15:43, 157.21s/epoch, Train Loss=0.343, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [15]  [ 0/70]  eta: 0:01:48  lr: 0.000001  loss: 0.3306 (0.3306)  loss_classifier: 0.0763 (0.0763)  loss_box_reg: 0.2297 (0.2297)  loss_objectness: 0.0045 (0.0045)  loss_rpn_box_reg: 0.0202 (0.0202)  time: 1.5510  data: 0.7554  max mem: 8871
Epoch: [15]  [10/70]  eta: 0:01:34  lr: 0.000001  loss: 0.3840 (0.3767)  loss_classifier: 0.0804 (0.0943)  loss_box_reg: 0.2438 (0.2370)  loss_objectness: 0.0136 (0.0140)  loss_rpn_box_reg: 0.0269 (0.0315)  time: 1.5747  data: 0.8087  max mem: 8871
Epoch: [15]  [20/70]  eta: 0:01:16  lr: 0.000001  loss: 0.3347 (0.3478)  loss_classifier: 0.0817 (0.0891)  loss_box_reg: 0.2273 (0.2202)  loss_objectness: 0.0117 (0.0122)  loss_rpn_box_reg: 0.0205 (0.0264)  time: 1.5345  data: 0.7662  max mem: 8871
Epoch: [15]  [30/70]  eta: 0:01:01  lr: 0.000001  loss: 0.3075 (0.3453)  loss_classifier: 0.0824 (0.0889)  loss_box_reg: 0.2066 (0.2172)  loss_objectness: 0.0097 (0.0129)  loss_rpn_box_reg: 0.0201 (0.0264)  time: 1.5190  data: 0.7502  max mem: 8871
Epoch: [15]  [40/70]  eta: 0:00:46  lr: 0.000001  loss: 0.3294 (0.3470)  loss_classifier: 0.0842 (0.0908)  loss_box_reg: 0.2127 (0.2178)  loss_objectness: 0.0095 (0.0121)  loss_rpn_box_reg: 0.0214 (0.0263)  time: 1.5559  data: 0.7806  max mem: 8871
Epoch: [15]  [50/70]  eta: 0:00:30  lr: 0.000001  loss: 0.3090 (0.3409)  loss_classifier: 0.0842 (0.0894)  loss_box_reg: 0.2002 (0.2146)  loss_objectness: 0.0082 (0.0113)  loss_rpn_box_reg: 0.0223 (0.0256)  time: 1.5162  data: 0.7302  max mem: 8871
Epoch: [15]  [60/70]  eta: 0:00:15  lr: 0.000001  loss: 0.3265 (0.3364)  loss_classifier: 0.0846 (0.0883)  loss_box_reg: 0.2074 (0.2123)  loss_objectness: 0.0083 (0.0110)  loss_rpn_box_reg: 0.0212 (0.0248)  time: 1.4796  data: 0.7112  max mem: 8871
Epoch: [15]  [69/70]  eta: 0:00:01  lr: 0.000001  loss: 0.3521 (0.3421)  loss_classifier: 0.0885 (0.0890)  loss_box_reg: 0.2249 (0.2158)  loss_objectness: 0.0098 (0.0117)  loss_rpn_box_reg: 0.0255 (0.0255)  time: 1.5015  data: 0.7568  max mem: 8871
Epoch: [15] Total time: 0:01:46 (1.5199 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3824 (0.3824)  evaluator_time: 0.9304 (0.9304)  time: 2.8024  data: 1.4879  max mem: 8871
Epoch:  54%|█████▍    | 7/13 [20:50<15:43, 157.21s/epoch, Train Loss=0.342, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3795 (0.3750)  evaluator_time: 0.1967 (0.2558)  time: 1.7161  data: 1.0834  max mem: 8871
Test: Total time: 0:00:30 (1.7162 s / it)
Averaged stats: model_time: 0.3795 (0.3750)  evaluator_time: 0.1967 (0.2558)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.586
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.667
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.020
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.129
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.654
Epoch:  62%|██████▏   | 8/13 [20:51<13:07, 157.42s/epoch, Train Loss=0.342, Test Loss=0.586, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [16]  [ 0/70]  eta: 0:01:42  lr: 0.000000  loss: 0.4488 (0.4488)  loss_classifier: 0.1038 (0.1038)  loss_box_reg: 0.2909 (0.2909)  loss_objectness: 0.0161 (0.0161)  loss_rpn_box_reg: 0.0379 (0.0379)  time: 1.4672  data: 0.6637  max mem: 8871
Epoch: [16]  [10/70]  eta: 0:01:31  lr: 0.000000  loss: 0.3422 (0.3188)  loss_classifier: 0.0819 (0.0799)  loss_box_reg: 0.2155 (0.2048)  loss_objectness: 0.0101 (0.0119)  loss_rpn_box_reg: 0.0225 (0.0222)  time: 1.5277  data: 0.7398  max mem: 8871
Epoch: [16]  [20/70]  eta: 0:01:17  lr: 0.000000  loss: 0.3119 (0.3235)  loss_classifier: 0.0756 (0.0865)  loss_box_reg: 0.2024 (0.2029)  loss_objectness: 0.0098 (0.0113)  loss_rpn_box_reg: 0.0187 (0.0228)  time: 1.5582  data: 0.7767  max mem: 8871
Epoch: [16]  [30/70]  eta: 0:01:01  lr: 0.000000  loss: 0.3059 (0.3259)  loss_classifier: 0.0791 (0.0852)  loss_box_reg: 0.2024 (0.2063)  loss_objectness: 0.0092 (0.0113)  loss_rpn_box_reg: 0.0206 (0.0232)  time: 1.5388  data: 0.7632  max mem: 8871
Epoch: [16]  [40/70]  eta: 0:00:45  lr: 0.000000  loss: 0.3412 (0.3365)  loss_classifier: 0.0819 (0.0871)  loss_box_reg: 0.2198 (0.2133)  loss_objectness: 0.0099 (0.0117)  loss_rpn_box_reg: 0.0231 (0.0244)  time: 1.4884  data: 0.7133  max mem: 8871
Epoch: [16]  [50/70]  eta: 0:00:30  lr: 0.000000  loss: 0.3532 (0.3332)  loss_classifier: 0.0819 (0.0861)  loss_box_reg: 0.2237 (0.2119)  loss_objectness: 0.0099 (0.0112)  loss_rpn_box_reg: 0.0251 (0.0240)  time: 1.5233  data: 0.7536  max mem: 8871
Epoch: [16]  [60/70]  eta: 0:00:15  lr: 0.000000  loss: 0.3523 (0.3382)  loss_classifier: 0.0849 (0.0876)  loss_box_reg: 0.2206 (0.2141)  loss_objectness: 0.0099 (0.0113)  loss_rpn_box_reg: 0.0228 (0.0253)  time: 1.5466  data: 0.7884  max mem: 8871
Epoch: [16]  [69/70]  eta: 0:00:01  lr: 0.000000  loss: 0.3523 (0.3421)  loss_classifier: 0.0836 (0.0885)  loss_box_reg: 0.2261 (0.2166)  loss_objectness: 0.0111 (0.0114)  loss_rpn_box_reg: 0.0256 (0.0255)  time: 1.4863  data: 0.7431  max mem: 8871
Epoch: [16] Total time: 0:01:46 (1.5195 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3858 (0.3858)  evaluator_time: 0.9365 (0.9365)  time: 2.8080  data: 1.4839  max mem: 8871
Epoch:  62%|██████▏   | 8/13 [23:28<13:07, 157.42s/epoch, Train Loss=0.342, Test Loss=0.587, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3786 (0.3747)  evaluator_time: 0.1912 (0.2569)  time: 1.7087  data: 1.0752  max mem: 8871
Test: Total time: 0:00:30 (1.7088 s / it)
Averaged stats: model_time: 0.3786 (0.3747)  evaluator_time: 0.1912 (0.2569)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.587
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.669
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.654
Epoch:  69%|██████▉   | 9/13 [23:29<10:30, 157.54s/epoch, Train Loss=0.342, Test Loss=0.587, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [17]  [ 0/70]  eta: 0:01:35  lr: 0.000000  loss: 0.2973 (0.2973)  loss_classifier: 0.0738 (0.0738)  loss_box_reg: 0.1960 (0.1960)  loss_objectness: 0.0039 (0.0039)  loss_rpn_box_reg: 0.0235 (0.0235)  time: 1.3574  data: 0.6805  max mem: 8871
Epoch: [17]  [10/70]  eta: 0:01:29  lr: 0.000000  loss: 0.3502 (0.3682)  loss_classifier: 0.0789 (0.0866)  loss_box_reg: 0.2384 (0.2409)  loss_objectness: 0.0072 (0.0100)  loss_rpn_box_reg: 0.0271 (0.0306)  time: 1.4979  data: 0.7532  max mem: 8871
Epoch: [17]  [20/70]  eta: 0:01:17  lr: 0.000000  loss: 0.3823 (0.3702)  loss_classifier: 0.0885 (0.0908)  loss_box_reg: 0.2377 (0.2362)  loss_objectness: 0.0108 (0.0117)  loss_rpn_box_reg: 0.0286 (0.0314)  time: 1.5563  data: 0.8018  max mem: 8871
Epoch: [17]  [30/70]  eta: 0:01:01  lr: 0.000000  loss: 0.3507 (0.3530)  loss_classifier: 0.0885 (0.0886)  loss_box_reg: 0.2193 (0.2261)  loss_objectness: 0.0099 (0.0109)  loss_rpn_box_reg: 0.0241 (0.0274)  time: 1.5430  data: 0.7827  max mem: 8872
Epoch: [17]  [40/70]  eta: 0:00:45  lr: 0.000000  loss: 0.3372 (0.3522)  loss_classifier: 0.0885 (0.0897)  loss_box_reg: 0.2099 (0.2243)  loss_objectness: 0.0102 (0.0115)  loss_rpn_box_reg: 0.0200 (0.0267)  time: 1.4949  data: 0.7256  max mem: 8872
Epoch: [17]  [50/70]  eta: 0:00:30  lr: 0.000000  loss: 0.3385 (0.3493)  loss_classifier: 0.0900 (0.0891)  loss_box_reg: 0.2105 (0.2226)  loss_objectness: 0.0102 (0.0116)  loss_rpn_box_reg: 0.0247 (0.0260)  time: 1.4963  data: 0.7275  max mem: 8872
Epoch: [17]  [60/70]  eta: 0:00:15  lr: 0.000000  loss: 0.3215 (0.3450)  loss_classifier: 0.0834 (0.0883)  loss_box_reg: 0.2049 (0.2194)  loss_objectness: 0.0090 (0.0115)  loss_rpn_box_reg: 0.0200 (0.0257)  time: 1.4930  data: 0.7185  max mem: 8872
Epoch: [17]  [69/70]  eta: 0:00:01  lr: 0.000000  loss: 0.3293 (0.3426)  loss_classifier: 0.0866 (0.0883)  loss_box_reg: 0.2173 (0.2172)  loss_objectness: 0.0092 (0.0116)  loss_rpn_box_reg: 0.0209 (0.0255)  time: 1.4855  data: 0.7083  max mem: 8872
Epoch: [17] Total time: 0:01:45 (1.5063 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3869 (0.3869)  evaluator_time: 0.9268 (0.9268)  time: 2.7971  data: 1.4815  max mem: 8872
Epoch:  69%|██████▉   | 9/13 [26:05<10:30, 157.54s/epoch, Train Loss=0.343, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3797 (0.3748)  evaluator_time: 0.1995 (0.2570)  time: 1.7098  data: 1.0760  max mem: 8872
Test: Total time: 0:00:30 (1.7099 s / it)
Averaged stats: model_time: 0.3797 (0.3748)  evaluator_time: 0.1995 (0.2570)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.588
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.958
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.672
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.193
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.157
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.654
Epoch:  77%|███████▋  | 10/13 [26:05<07:51, 157.17s/epoch, Train Loss=0.343, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Epoch: [18]  [ 0/70]  eta: 0:01:49  lr: 0.000000  loss: 0.2253 (0.2253)  loss_classifier: 0.0609 (0.0609)  loss_box_reg: 0.1479 (0.1479)  loss_objectness: 0.0060 (0.0060)  loss_rpn_box_reg: 0.0105 (0.0105)  time: 1.5623  data: 0.7678  max mem: 8872
Epoch: [18]  [10/70]  eta: 0:01:33  lr: 0.000000  loss: 0.3159 (0.3345)  loss_classifier: 0.0759 (0.0859)  loss_box_reg: 0.2072 (0.2153)  loss_objectness: 0.0069 (0.0096)  loss_rpn_box_reg: 0.0215 (0.0237)  time: 1.5570  data: 0.7803  max mem: 8872
Epoch: [18]  [20/70]  eta: 0:01:17  lr: 0.000000  loss: 0.3325 (0.3414)  loss_classifier: 0.0778 (0.0861)  loss_box_reg: 0.2206 (0.2223)  loss_objectness: 0.0074 (0.0089)  loss_rpn_box_reg: 0.0235 (0.0241)  time: 1.5436  data: 0.7681  max mem: 8872
Epoch: [18]  [30/70]  eta: 0:01:01  lr: 0.000000  loss: 0.3483 (0.3530)  loss_classifier: 0.0893 (0.0902)  loss_box_reg: 0.2330 (0.2284)  loss_objectness: 0.0082 (0.0099)  loss_rpn_box_reg: 0.0237 (0.0244)  time: 1.5343  data: 0.7578  max mem: 8872
Epoch: [18]  [40/70]  eta: 0:00:46  lr: 0.000000  loss: 0.3463 (0.3418)  loss_classifier: 0.0819 (0.0869)  loss_box_reg: 0.2231 (0.2214)  loss_objectness: 0.0093 (0.0100)  loss_rpn_box_reg: 0.0225 (0.0234)  time: 1.5309  data: 0.7496  max mem: 8872
Epoch: [18]  [50/70]  eta: 0:00:30  lr: 0.000000  loss: 0.3161 (0.3434)  loss_classifier: 0.0851 (0.0891)  loss_box_reg: 0.2045 (0.2196)  loss_objectness: 0.0092 (0.0106)  loss_rpn_box_reg: 0.0223 (0.0242)  time: 1.5037  data: 0.7290  max mem: 8872
Epoch: [18]  [60/70]  eta: 0:00:15  lr: 0.000000  loss: 0.3315 (0.3448)  loss_classifier: 0.0884 (0.0885)  loss_box_reg: 0.2139 (0.2199)  loss_objectness: 0.0103 (0.0109)  loss_rpn_box_reg: 0.0240 (0.0255)  time: 1.5189  data: 0.7721  max mem: 8872
Epoch: [18]  [69/70]  eta: 0:00:01  lr: 0.000000  loss: 0.3180 (0.3393)  loss_classifier: 0.0795 (0.0871)  loss_box_reg: 0.2078 (0.2161)  loss_objectness: 0.0124 (0.0111)  loss_rpn_box_reg: 0.0237 (0.0250)  time: 1.5019  data: 0.7584  max mem: 8872
Epoch: [18] Total time: 0:01:46 (1.5230 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3903 (0.3903)  evaluator_time: 0.9237 (0.9237)  time: 2.7991  data: 1.4833  max mem: 8872
Epoch:  77%|███████▋  | 10/13 [28:42<07:51, 157.17s/epoch, Train Loss=0.339, Test Loss=0.587, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3788 (0.3753)  evaluator_time: 0.1986 (0.2583)  time: 1.7063  data: 1.0707  max mem: 8872
Test: Total time: 0:00:30 (1.7064 s / it)
Averaged stats: model_time: 0.3788 (0.3753)  evaluator_time: 0.1986 (0.2583)
Accumulating evaluation results...
DONE (t=0.05s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.587
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.666
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.143
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.653
Epoch:  85%|████████▍ | 11/13 [28:43<05:14, 157.23s/epoch, Train Loss=0.339, Test Loss=0.587, Train Accuracy=0.000, Test Accuracy=0.957]
Epoch: [19]  [ 0/70]  eta: 0:01:38  lr: 0.000000  loss: 0.3407 (0.3407)  loss_classifier: 0.0735 (0.0735)  loss_box_reg: 0.2373 (0.2373)  loss_objectness: 0.0075 (0.0075)  loss_rpn_box_reg: 0.0224 (0.0224)  time: 1.4033  data: 0.7253  max mem: 8872
Epoch: [19]  [10/70]  eta: 0:01:31  lr: 0.000000  loss: 0.3407 (0.3511)  loss_classifier: 0.0915 (0.0943)  loss_box_reg: 0.2314 (0.2155)  loss_objectness: 0.0126 (0.0135)  loss_rpn_box_reg: 0.0224 (0.0278)  time: 1.5289  data: 0.7433  max mem: 8872
Epoch: [19]  [20/70]  eta: 0:01:15  lr: 0.000000  loss: 0.3197 (0.3354)  loss_classifier: 0.0845 (0.0894)  loss_box_reg: 0.1985 (0.2082)  loss_objectness: 0.0106 (0.0129)  loss_rpn_box_reg: 0.0214 (0.0250)  time: 1.5180  data: 0.7380  max mem: 8872
Epoch: [19]  [30/70]  eta: 0:00:59  lr: 0.000000  loss: 0.3197 (0.3373)  loss_classifier: 0.0845 (0.0872)  loss_box_reg: 0.1985 (0.2124)  loss_objectness: 0.0106 (0.0128)  loss_rpn_box_reg: 0.0221 (0.0249)  time: 1.4733  data: 0.7322  max mem: 8872
Epoch: [19]  [40/70]  eta: 0:00:44  lr: 0.000000  loss: 0.3372 (0.3393)  loss_classifier: 0.0903 (0.0880)  loss_box_reg: 0.2076 (0.2124)  loss_objectness: 0.0105 (0.0127)  loss_rpn_box_reg: 0.0225 (0.0262)  time: 1.4847  data: 0.7552  max mem: 8872
Epoch: [19]  [50/70]  eta: 0:00:30  lr: 0.000000  loss: 0.3301 (0.3382)  loss_classifier: 0.0903 (0.0875)  loss_box_reg: 0.2107 (0.2131)  loss_objectness: 0.0082 (0.0118)  loss_rpn_box_reg: 0.0223 (0.0257)  time: 1.5269  data: 0.7686  max mem: 8872
Epoch: [19]  [60/70]  eta: 0:00:15  lr: 0.000000  loss: 0.3426 (0.3444)  loss_classifier: 0.0894 (0.0885)  loss_box_reg: 0.2241 (0.2182)  loss_objectness: 0.0062 (0.0116)  loss_rpn_box_reg: 0.0245 (0.0261)  time: 1.5340  data: 0.7696  max mem: 8872
Epoch: [19]  [69/70]  eta: 0:00:01  lr: 0.000000  loss: 0.3244 (0.3426)  loss_classifier: 0.0856 (0.0878)  loss_box_reg: 0.2255 (0.2174)  loss_objectness: 0.0097 (0.0116)  loss_rpn_box_reg: 0.0216 (0.0259)  time: 1.4695  data: 0.7149  max mem: 8872
Epoch: [19] Total time: 0:01:44 (1.4971 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:50  model_time: 0.3944 (0.3944)  evaluator_time: 0.9351 (0.9351)  time: 2.8310  data: 1.4998  max mem: 8872
Epoch:  85%|████████▍ | 11/13 [31:17<05:14, 157.23s/epoch, Train Loss=0.343, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3790 (0.3752)  evaluator_time: 0.1979 (0.2577)  time: 1.7038  data: 1.0689  max mem: 8872
Test: Total time: 0:00:30 (1.7039 s / it)
Averaged stats: model_time: 0.3790 (0.3752)  evaluator_time: 0.1979 (0.2577)
Accumulating evaluation results...
DONE (t=0.07s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.588
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.958
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.672
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.193
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.653
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.157
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.654
Epoch:  92%|█████████▏| 12/13 [31:18<02:36, 156.70s/epoch, Train Loss=0.343, Test Loss=0.588, Train Accuracy=0.000, Test Accuracy=0.958]
Epoch: [20]  [ 0/70]  eta: 0:01:41  lr: 0.000000  loss: 0.3026 (0.3026)  loss_classifier: 0.0798 (0.0798)  loss_box_reg: 0.1961 (0.1961)  loss_objectness: 0.0052 (0.0052)  loss_rpn_box_reg: 0.0216 (0.0216)  time: 1.4486  data: 0.7687  max mem: 8872
Epoch: [20]  [10/70]  eta: 0:01:28  lr: 0.000000  loss: 0.3601 (0.3791)  loss_classifier: 0.1037 (0.1029)  loss_box_reg: 0.2468 (0.2322)  loss_objectness: 0.0171 (0.0155)  loss_rpn_box_reg: 0.0233 (0.0285)  time: 1.4833  data: 0.7282  max mem: 8872
Epoch: [20]  [20/70]  eta: 0:01:13  lr: 0.000000  loss: 0.3601 (0.3697)  loss_classifier: 0.0932 (0.0971)  loss_box_reg: 0.2348 (0.2312)  loss_objectness: 0.0112 (0.0140)  loss_rpn_box_reg: 0.0233 (0.0276)  time: 1.4723  data: 0.7154  max mem: 8872
Epoch: [20]  [30/70]  eta: 0:00:59  lr: 0.000000  loss: 0.3570 (0.3620)  loss_classifier: 0.0932 (0.0960)  loss_box_reg: 0.2294 (0.2258)  loss_objectness: 0.0097 (0.0133)  loss_rpn_box_reg: 0.0245 (0.0269)  time: 1.4959  data: 0.7383  max mem: 8872
Epoch: [20]  [40/70]  eta: 0:00:44  lr: 0.000000  loss: 0.3251 (0.3481)  loss_classifier: 0.0839 (0.0907)  loss_box_reg: 0.2174 (0.2203)  loss_objectness: 0.0084 (0.0121)  loss_rpn_box_reg: 0.0191 (0.0251)  time: 1.4873  data: 0.7123  max mem: 8872
Epoch: [20]  [50/70]  eta: 0:00:29  lr: 0.000000  loss: 0.2837 (0.3446)  loss_classifier: 0.0762 (0.0897)  loss_box_reg: 0.1852 (0.2177)  loss_objectness: 0.0088 (0.0123)  loss_rpn_box_reg: 0.0186 (0.0249)  time: 1.4540  data: 0.6847  max mem: 8872
Epoch: [20]  [60/70]  eta: 0:00:14  lr: 0.000000  loss: 0.3278 (0.3423)  loss_classifier: 0.0805 (0.0889)  loss_box_reg: 0.2088 (0.2168)  loss_objectness: 0.0105 (0.0118)  loss_rpn_box_reg: 0.0208 (0.0248)  time: 1.4974  data: 0.7438  max mem: 8872
Epoch: [20]  [69/70]  eta: 0:00:01  lr: 0.000000  loss: 0.3401 (0.3434)  loss_classifier: 0.0845 (0.0888)  loss_box_reg: 0.2242 (0.2175)  loss_objectness: 0.0082 (0.0117)  loss_rpn_box_reg: 0.0230 (0.0254)  time: 1.4569  data: 0.7066  max mem: 8872
Epoch: [20] Total time: 0:01:43 (1.4736 s / it)
creating index...
index created!
Test:  [ 0/18]  eta: 0:00:49  model_time: 0.3931 (0.3931)  evaluator_time: 0.9160 (0.9160)  time: 2.7570  data: 1.4462  max mem: 8872
Epoch:  92%|█████████▏| 12/13 [33:50<02:36, 156.70s/epoch, Train Loss=0.343, Test Loss=0.585, Train Accuracy=0.000, Test Accuracy=0.957]
Test:  [17/18]  eta: 0:00:01  model_time: 0.3771 (0.3744)  evaluator_time: 0.1945 (0.2552)  time: 1.6764  data: 1.0449  max mem: 8872
Test: Total time: 0:00:30 (1.6765 s / it)
Averaged stats: model_time: 0.3771 (0.3744)  evaluator_time: 0.1945 (0.2552)
Accumulating evaluation results...
DONE (t=0.06s).
IoU metric: bbox
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.585
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.957
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.662
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.021
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.192
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.651
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.157
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.652
Epoch: 100%|██████████| 13/13 [33:51<00:00, 156.28s/epoch, Train Loss=0.343, Test Loss=0.585, Train Accuracy=0.000, Test Accuracy=0.957]
 
Finished Training

Verification of results¶

🔮 Helper function:

Plots the loss and accuracy curves of a model.

In [40]:
def plot_results(train_losses: List[float], test_losses: List[float], train_accs: List[float], test_accs: List[float], title: str):
    plt.figure(figsize=(10, 5))
    plt.subplot(121)
    plt.plot(train_losses, 'r')
    plt.plot(test_losses, 'g')
    plt.title('Loss')
    plt.legend(['train', 'test'])
    plt.xlabel('Epoch')
    plt.ylabel('Loss')
    plt.grid(True)

    plt.subplot(122)
    plt.plot(train_accs, 'r')
    plt.plot(test_accs, 'g')
    plt.title('Accuracy')
    plt.legend(['train', 'test'])
    plt.xlabel('Epoch')
    plt.ylabel('Accuracy')
    plt.grid(True)

    plt.suptitle(title)
    plt.show()

We check the results:

In [41]:
# Load the results JSON file
results_json_path = os.path.join(RUTA_CARPETA_MODELS, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID + '_results.json')
with open(results_json_path, 'r') as file:
    data = json.load(file)

# Assign variables from the saved training history
train_losses = data['train_losses']
test_losses = data['test_losses']
train_accs = data['train_accs']
test_accs = data['test_accs']
best_test_acc = data['best_test_acc']

plot_results(train_losses, test_losses, train_accs, test_accs, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID)
In [42]:
# Load the results JSON file
results_json_path = os.path.join(RUTA_CARPETA_MODELS, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG + '_results.json')
with open(results_json_path, 'r') as file:
    data = json.load(file)

# Assign variables from the saved training history
train_losses = data['train_losses']
test_losses = data['test_losses']
train_accs = data['train_accs']
test_accs = data['test_accs']
best_test_acc = data['best_test_acc']

plot_results(train_losses, test_losses, train_accs, test_accs, FASTERRCNN_RESNET50_FPN_V2_MODEL_ID_SIN_AUG)

Inference test¶

We can check how our model performs on a single image:

In [43]:
image, _ = test_dataset[0]

plt.figure()
plt.imshow(image.permute(1, 2, 0))
Out[43]:
<matplotlib.image.AxesImage at 0x7e5812d1f790>

And the predictions are:

In [45]:
# Threshold used below (e.g. 0.5)
score_threshold = 0.5

# Run inference in evaluation mode
model.eval()

# Pick an image and target from the test set
image, target = test_dataset[0]  # The target is only used to recover the image name.

model.to(DEVICE)
image = image.to(DEVICE)

# Run the prediction
with torch.no_grad():
    # The model expects a list of images
    prediction = model([image])[0]

# Move boxes, labels and scores back to CPU
predicted_boxes = prediction["boxes"].cpu()
predicted_scores = prediction["scores"].cpu()
predicted_labels = prediction["labels"].cpu()
image = image.cpu()

# Apply non-maximum suppression to prune overlapping boxes.
# Note: the threshold is used here as the NMS IoU threshold,
# not as a minimum confidence score (the score-based filter is commented out).
# indices = predicted_scores > score_threshold
indices = nms(predicted_boxes, predicted_scores, iou_threshold=score_threshold)
filtered_boxes = predicted_boxes[indices]
filtered_labels = [
    f"{list(LABELS.keys())[list(LABELS.values()).index(label)]} ({score:.2f})"
    for label, score in zip(predicted_labels[indices], predicted_scores[indices])
]

# Visualization settings
config = {
    "linewidth": 1,
    "edgecolor": "r",
    "color": "white",
    "fontsize": 9,
    "backgroundcolor": "red",
    "putText": True,
    "show_image_name": True,
    "super_title": "Predicciones del Modelo",
    "super_title_fontsize": 16,
    "super_title_y": 0.90,
    "from_dataset_voc": True,
}

# Build the target dict expected by the visualization helper
target_for_visualization = {
    "boxes": filtered_boxes,
    "labels": filtered_labels,
    "img_path": target["img_path"]
}

# Show the image with the predicted boxes
show_single_image(image, target_for_visualization, config)
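For reference, `torchvision.ops.nms` greedily keeps the highest-scoring box and discards any remaining box whose IoU with it exceeds `iou_threshold`. A minimal pure-Python sketch of that greedy procedure (illustrative only, not the torchvision implementation):

```python
def iou(a, b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def simple_nms(boxes, scores, iou_threshold=0.5):
    """Greedy NMS: returns the indices of the boxes that survive,
    ordered by descending score."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        # Drop every remaining box that overlaps the kept box too much
        order = [i for i in order if iou(boxes[best], boxes[i]) <= iou_threshold]
    return keep
```

For example, with two heavily overlapping boxes only the higher-scoring one survives: `simple_nms([(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)], [0.9, 0.8, 0.7], 0.5)` returns `[0, 2]`.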

A function can also be created so the model can be used in another environment:

In [46]:
def show_model_prediction(
    model: torch.nn.Module,
    image_path: str,
    config: dict,
    get_transforms: Callable,
    device: str,
    score_threshold: float = 0.5,
) -> None:
    """
    Shows the predictions of a Faster R-CNN model on a given image.

    Args:
        model (torch.nn.Module): The object detection model.
        image_path (str): Path of the image to process.
        config (dict): Style configuration for the boxes and text.
        get_transforms (callable): Function that returns the transformations to apply to the image.
        device (str): Device on which to run the inference.
        score_threshold (float): Threshold used as the NMS IoU threshold.

    Returns:
        None
    """
    # Load the image
    image = read_image(image_path)

    # Apply the transformations to the image
    transforms = get_transforms(train=False)
    # image, _ = transforms(image, {})
    image = transforms(image)

    model.to(device)
    image = image.to(device)

    # Run the prediction
    model.eval()
    with torch.no_grad():
        prediction = model([image])[0]

    # Move boxes, labels and scores back to CPU
    predicted_boxes = prediction["boxes"].cpu()
    predicted_scores = prediction["scores"].cpu()
    predicted_labels = prediction["labels"].cpu()
    image = image.cpu()

    # Apply non-maximum suppression (the threshold acts as the NMS IoU threshold)
    indices = nms(predicted_boxes, predicted_scores, iou_threshold=score_threshold)
    filtered_boxes = predicted_boxes[indices]
    filtered_labels = [
        f"{list(LABELS.keys())[list(LABELS.values()).index(label)]} ({score:.2f})"
        for label, score in zip(predicted_labels[indices], predicted_scores[indices])
    ]

    # Build the target dict for visualization
    target_for_visualization = {
        "boxes": filtered_boxes,
        "labels": filtered_labels,
        "img_path": image_path,
    }

    # Show the image with the predicted boxes
    show_single_image(image, target_for_visualization, config)

We try predicting a random image:

In [47]:
image_path, _ = get_random_image_and_annotation(
    RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA, RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_LABELS
)

config = {
    "linewidth": 1,
    "edgecolor": "r",
    "color": "white",
    "fontsize": 9,
    "backgroundcolor": "red",
    "putText": True,
    "show_image_name": True,
    "super_title": "Predicciones para una imagen aleatoria",
    "super_title_fontsize": 16,
    "super_title_y": 0.85,
    "from_dataset_voc": True,
}

show_model_prediction(model, image_path, config, get_transforms, DEVICE, 0.5)

YOLO¶

YOLO (You Only Look Once) is a single-stage, real-time object detection algorithm that uses a single convolutional neural network to detect objects in images. Multiple versions exist, one of the most recent being YOLOv8 (2023). This version will be used, and, out of curiosity, a small comparison will be made against the newer YOLOv11 (2024).

🤖 AI consultation (ChatGPT 4o): 🤖

❔ Prompt:

Could you generate an introduction to the YOLOv11 model? It should include its components and its advantages and disadvantages, and also compare it against other state-of-the-art models.

🔃 Response:

Introduction to the YOLOv11 Model

The YOLOv11 model (You Only Look Once, version 11) represents the latest evolution in the YOLO family of real-time object detection algorithms. Across its previous versions, YOLO has been widely recognized for its unified, efficient approach to detecting multiple objects in images at high speed, which has made it one of the most popular methods in computer vision applications such as security systems and autonomous vehicles. YOLOv11 introduces several improvements in architecture, performance, and accuracy.

Main components of YOLOv11

  1. Backbone: The backbone of YOLOv11 is an improved convolutional neural network (CNN) designed specifically to extract deep features from the input images. This version includes more efficient architectures, such as a variant of CSPDarknet, which improves learning capacity with fewer parameters and better management of network bandwidth.

  2. Neck: In YOLOv11, the neck has been optimized to better capture features at different scales. It uses more advanced FPN (Feature Pyramid Network) and PAN (Path Aggregation Network) modules, enabling more precise detection of small, hard-to-recognize objects.

  3. Head: The model head, responsible for the final class and bounding-box predictions, has been refined to be more efficient. The use of anchors has been improved, making it more adaptable to detecting objects of varying sizes.

  4. Attention mechanisms: YOLOv11 incorporates new attention techniques that let the network focus on the most relevant areas of the image, improving accuracy without significantly increasing the computational load.

Advantages of YOLOv11

  • Speed: Like its predecessors, YOLOv11 excels at real-time object detection. The model is optimized to make predictions in a single pass over the image, which makes it one of the fastest models available for real-time applications.
  • Improved accuracy: YOLOv11 shows a significant improvement in accuracy, especially for small objects or complex scenes with overlapping objects, thanks to its advanced feature-extraction architecture.
  • Scalability: Its modular design allows it to be adapted to different needs, from deployments on resource-constrained devices to more powerful applications.

Disadvantages of YOLOv11

  • Higher computational complexity: Despite the gains in speed and accuracy, YOLOv11 requires more computational capacity than earlier versions due to its deeper network and the new attention layers.
  • Not as accurate as other models on small objects: Although it has improved in this regard, it can still struggle to detect extremely small objects compared with models that specialize in such tasks, like RetinaNet or EfficientDet.

Comparación con otros modelos del estado del arte

  1. YOLOv11 vs. EfficientDet: EfficientDet se destaca por su balance entre precisión y eficiencia gracias a su escalabilidad. Si bien YOLOv11 es más rápido en detecciones en tiempo real, EfficientDet logra mejores resultados en tareas que requieren alta precisión, como la detección de objetos pequeños en imágenes de alta resolución. Sin embargo, EfficientDet puede ser más lento en implementaciones donde la velocidad es crítica.
  2. YOLOv11 vs. RetinaNet: RetinaNet es conocido por su capacidad para manejar problemas de desbalance de clases con su innovadora Focal Loss, lo que lo convierte en una opción sólida para detectar objetos pequeños o en escenas con alta densidad de objetos. Aunque YOLOv11 ha cerrado la brecha en este aspecto, RetinaNet sigue siendo más adecuado para escenarios en los que la precisión es más importante que la velocidad.
  3. YOLOv11 vs. DETR (Detection Transformer): DETR ha revolucionado el campo con su enfoque basado en transformadores para la detección de objetos, eliminando la necesidad de anclas y logrando un rendimiento impresionante en tareas complejas. Sin embargo, DETR tiende a ser más lento que YOLOv11 y requiere más recursos computacionales. YOLOv11 es preferible en aplicaciones donde se necesita un alto rendimiento en tiempo real con un hardware menos potente.

Conclusion

YOLOv11 marks a significant step forward in the YOLO family, balancing speed and accuracy to suit a wide range of computer-vision applications. Although it faces strong competition in the object-detection state of the art, it remains an outstanding choice for scenarios where fast inference time is crucial. The architectural improvements, together with its ability to handle different object sizes and complex scenes, position it as a versatile tool in applied artificial intelligence.

Figure 4: YOLOv10 architecture, shown for reference; the YOLOv11 architecture was not yet available. Taken from: https://blog.roboflow.com/what-is-yolov10

Prepare the dataset¶

The YOLOv8 label format is: <class_id> <x_center> <y_center> <width> <height>. The image dataset used here ships a pair of .csv files with the label data for each image; the images themselves are already divided into two sets: train and test.

Since the image set is already divided, the existing split is reused to keep the comparison between models consistent.
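As a quick worked example of the format described above, the helper below (written here for illustration; the notebook's own conversion function comes later) maps a VOC-style box (xmin, ymin, xmax, ymax) on a 4000×3000 image to normalized YOLO coordinates:

```python
def voc_to_yolo(xmin, ymin, xmax, ymax, img_w, img_h):
    # Center coordinates and box size, normalized to [0, 1]
    x_c = (xmin + xmax) / 2.0 / img_w
    y_c = (ymin + ymax) / 2.0 / img_h
    w = (xmax - xmin) / img_w
    h = (ymax - ymin) / img_h
    return x_c, y_c, w, h

# Box (1147, 1694)-(1337, 1863) on a 4000x3000 image
# (the first Palm annotation in the train set)
print(voc_to_yolo(1147, 1694, 1337, 1863, 4000, 3000))
```

With class_id 1 (Palm), the resulting label line would read `1 0.310500 0.592833 0.047500 0.056333`.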

In [12]:
# Labels:
LABELS = {
    'Tree' : 0,
    'Palm' : 1
}

NOMBRE_MODELO_YOLO = 'palmeras'
RUTA_CARPETA_YOLO_PROYECTOS = 'yolo'
RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN = os.path.join(RUTA_CARPETA_YOLO_PROYECTOS, YOLOV8_MODEL_ID, 'train')
RUTA_CARPETA_YOLOV8_PROYECTOS_VAL = os.path.join(RUTA_CARPETA_YOLO_PROYECTOS, YOLOV8_MODEL_ID, 'val')

RUTA_CARPETA_YOLOV11_PROYECTOS_TRAIN = os.path.join(RUTA_CARPETA_YOLO_PROYECTOS, YOLOV11_MODEL_ID, 'train')
RUTA_CARPETA_YOLOV11_PROYECTOS_VAL = os.path.join(RUTA_CARPETA_YOLO_PROYECTOS, YOLOV11_MODEL_ID, 'val')

#RUTA_CARPETA_YOLOV8_PROYECTOS_BEST_TRAINED = os.path.join(RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN, NOMBRE_MODELO_YOLO, 'weights', 'best.pt')
In [ ]:
train_df = pd.read_csv(RUTA_CARPETA_KAGGLE_DATSET_YOLO_TRAIN_CSV)
test_df = pd.read_csv(RUTA_CARPETA_KAGGLE_DATSET_YOLO_TEST_CSV)

# Print the head of each dataframe to verify that it loaded correctly
print(train_df.head())
print(test_df.head())

# Check the available classes
print('--- Clases disponibles ---')
print('En Train: ', train_df['class'].unique())
print('En Test: ', test_df['class'].unique())
                        filename  width  height class  xmin  ymin  xmax  ymax
0  ck2oh0xx2sthb0838g3pmr4ky.jpg   4000    3000  Palm  1147  1694  1337  1863
1  ck2oh0xx2sthb0838g3pmr4ky.jpg   4000    3000  Palm  1392  1725  1567  1918
2  ck2oh0xx2sthb0838g3pmr4ky.jpg   4000    3000  Palm  1645  1816  1756  1933
3  ck2oh0xx2sthb0838g3pmr4ky.jpg   4000    3000  Palm  1913  1793  2104  1966
4  ck2oh0xx2sthb0838g3pmr4ky.jpg   4000    3000  Palm  2270  1817  2488  2034
                        filename  width  height class  xmin  ymin  xmax  ymax
0  ck2jfuaiwocqi07256rg8xca1.jpg   4864    3648  Tree  1978  1084  2349  1414
1  ck2jfuaiwocqi07256rg8xca1.jpg   4864    3648  Tree  2295  1947  2727  2465
2  ck2jfuaiwocqi07256rg8xca1.jpg   4864    3648  Tree  1358  1765  1912  2357
3  ck2jfuaiwocqi07256rg8xca1.jpg   4864    3648  Tree     0  2479   380  2981
4  ck2jfuaiwocqi07256rg8xca1.jpg   4864    3648  Tree  1664  2740  2268  3264
--- Clases disponibles ---
En Train:  ['Palm' 'Tree']
En Test:  ['Tree' 'Palm']

A function is defined to convert the annotation data stored in the CSV files into the .txt label format that YOLO expects.

In [ ]:
# Function to convert annotation box coordinates to YOLO format
def convert_to_yolo_format(row):
    """
    Converts annotation box coordinates to YOLO format.

    Args:
        row (pd.Series): DataFrame row with the annotations.

    Returns:
        tuple: A tuple with the data in YOLO format, or None on error.
    """
    try:
        image_width = row['width']
        image_height = row['height']
        x_center = (row['xmin'] + row['xmax']) / 2.0 / image_width
        y_center = (row['ymin'] + row['ymax']) / 2.0 / image_height
        width = (row['xmax'] - row['xmin']) / image_width
        height = (row['ymax'] - row['ymin']) / image_height
        class_id = LABELS.get(row['class'], None)
        if class_id is None:
            print(f"Class not found: <{row['class']}> in row <{row}>")
            return None
        return class_id, x_center, y_center, width, height
    except KeyError as e:
        print(f"Missing field in row: {e}")
        return None
    except TypeError:
        print("Invalid data found.")
        return None

# Function to save the annotations in YOLO format
def save_yolo_annotations(df, output_dir):
    """
    Saves the annotations in YOLO format, one .txt file per image.

    Args:
        df (pd.DataFrame): DataFrame with the annotations.
        output_dir (str): Output directory.

    Returns:
        None
    """
    os.makedirs(output_dir, exist_ok=True)

    for _, row in df.iterrows():
        yolo_data = convert_to_yolo_format(row)
        filename = os.path.splitext(row['filename'])[0] + '.txt'
        output_path = os.path.join(output_dir, filename)

        if yolo_data:
            # Append mode: each image accumulates one line per annotation.
            # Caveat: re-running this cell duplicates lines; remove the .txt files first.
            with open(output_path, 'a') as f:
                f.write(f"{yolo_data[0]} {yolo_data[1]:.6f} {yolo_data[2]:.6f} {yolo_data[3]:.6f} {yolo_data[4]:.6f}\n")
In [ ]:
save_yolo_annotations(train_df, RUTA_CARPETA_KAGGLE_DATASET_YOLO_TRAIN)
save_yolo_annotations(test_df, RUTA_CARPETA_KAGGLE_DATASET_YOLO_TEST)

YOLO expects the images to live in the same folder as the label text files: one .txt file per image, containing every detection label for that image.
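A minimal sketch to sanity-check that pairing after copying the images (the helper name `unpaired_images` is ours, demonstrated here on dummy files rather than the real dataset folders):

```python
import os
import tempfile

def unpaired_images(folder):
    """Return image files in `folder` that lack a same-named .txt label file."""
    images = [f for f in os.listdir(folder) if f.lower().endswith(('.jpg', '.png'))]
    return [f for f in images
            if not os.path.exists(os.path.join(folder, os.path.splitext(f)[0] + '.txt'))]

# Self-contained demo: 'a.jpg' has a label file, 'b.jpg' does not
with tempfile.TemporaryDirectory() as d:
    for name in ('a.jpg', 'a.txt', 'b.jpg'):
        open(os.path.join(d, name), 'w').close()
    print(unpaired_images(d))  # ['b.jpg']
```

Running it on the real train/test folders (empty list expected) catches images that were copied without a matching annotation file.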

In [ ]:
# Copy the images into the corresponding folders (train and test)
for _, row in train_df.iterrows():
    shutil.copy(os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC_TRAIN_DATA, row['filename']), RUTA_CARPETA_KAGGLE_DATASET_YOLO_TRAIN)

for _, row in test_df.iterrows():
    shutil.copy(os.path.join(RUTA_CARPETA_KAGGLE_DATASET_VOC_TEST_DATA, row['filename']), RUTA_CARPETA_KAGGLE_DATASET_YOLO_TEST)

Training with YOLOv8 and YOLOv11¶

In [ ]:
print(DEVICE)
cuda

Two models are trained with one small difference in hyperparameters: the input image size.

YOLO automatically tunes the learning rate and momentum based on the dataset size, the epochs, the iterations, and the inputs (i.e., the problem's complexity), and also picks the optimizer it considers best. There is an additional hyperparameter-tuning utility as well, but it could not be run on this machine.
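For reference, recent Ultralytics releases appear to derive the initial learning rate from the number of classes when `optimizer='auto'` (an assumption based on reading the library source; the exact heuristic may change between versions). A sketch of that formula, which for the two classes here matches the `lr=0.001667` reported in the training log below:

```python
def auto_lr(nc: int) -> float:
    # Heuristic initial LR apparently used by Ultralytics' optimizer='auto' mode
    # (assumption from the ultralytics source; may differ across versions)
    return round(0.002 * 5 / (4 + nc), 6)

print(auto_lr(2))  # 0.001667 for the 2-class Tree/Palm dataset
```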

In [14]:
# Parameters
YOLO_IMG_SIZE_1 = 640
YOLO_IMG_SIZE_2 = 480
YOLO_EPOCHS_1 = 100
YOLO_EPOCHS_2 = 200

# Training parameters for YOLO
params_1 = {
    "data": YOLOV8_CONFIG_YAML,
    "imgsz": YOLO_IMG_SIZE_1,
    "epochs": YOLO_EPOCHS_1,
    "device": DEVICE,
    "name": f'{NOMBRE_MODELO_YOLO}_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}_r0',
    "patience" : 20
}

params_2 = {
    "data": YOLOV8_CONFIG_YAML,
    "imgsz": YOLO_IMG_SIZE_2,
    "epochs": YOLO_EPOCHS_2,
    "device": DEVICE,
    "name": f'{NOMBRE_MODELO_YOLO}_imgsz{YOLO_IMG_SIZE_2}_epochs{YOLO_EPOCHS_2}_r0',
    "patience" : 20
}

The "nano" variants of YOLOv8 and YOLOv11 are used. Both come pre-trained on the COCO dataset, with weights and hyperparameters already defined, which makes them a good generalizable starting point.

In [ ]:
# Instantiate the YOLOv8 nano models
model_yolov8_p1 = YOLO("yolov8n.pt")
model_yolov8_p2 = YOLO("yolov8n.pt") 

# Instantiate the YOLOv11 nano models
model_yolov11_p1 = YOLO("yolo11n.pt")
model_yolov11_p2 = YOLO("yolo11n.pt") 
In [ ]:
%tensorboard --logdir yolo/yolov8/train/palmeras_imgsz640_epochs100_r0
In [ ]:
# Train the base model
model_yolov8_p1.train(project=RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN, **params_1)
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
engine\trainer: task=detect, mode=train, model=yolov8n.pt, data=./resources\utils\yolov8.yaml, epochs=100, time=None, patience=20, batch=16, imgsz=640, save=True, save_period=-1, cache=False, device=cuda, workers=8, project=yolo\yolov8\train, name=palmeras_imgsz640_epochs100_r0, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=4, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, label_smoothing=0.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=yolo\yolov8\train\palmeras_imgsz640_epochs100_r0
Overriding model.yaml nc=80 with nc=2

                   from  n    params  module                                       arguments                     
  0                  -1  1       464  ultralytics.nn.modules.conv.Conv             [3, 16, 3, 2]                 
  1                  -1  1      4672  ultralytics.nn.modules.conv.Conv             [16, 32, 3, 2]                
  2                  -1  1      7360  ultralytics.nn.modules.block.C2f             [32, 32, 1, True]             
  3                  -1  1     18560  ultralytics.nn.modules.conv.Conv             [32, 64, 3, 2]                
  4                  -1  2     49664  ultralytics.nn.modules.block.C2f             [64, 64, 2, True]             
  5                  -1  1     73984  ultralytics.nn.modules.conv.Conv             [64, 128, 3, 2]               
  6                  -1  2    197632  ultralytics.nn.modules.block.C2f             [128, 128, 2, True]           
  7                  -1  1    295424  ultralytics.nn.modules.conv.Conv             [128, 256, 3, 2]              
  8                  -1  1    460288  ultralytics.nn.modules.block.C2f             [256, 256, 1, True]           
  9                  -1  1    164608  ultralytics.nn.modules.block.SPPF            [256, 256, 5]                 
 10                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 11             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 12                  -1  1    148224  ultralytics.nn.modules.block.C2f             [384, 128, 1]                 
 13                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 14             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 15                  -1  1     37248  ultralytics.nn.modules.block.C2f             [192, 64, 1]                  
 16                  -1  1     36992  ultralytics.nn.modules.conv.Conv             [64, 64, 3, 2]                
 17            [-1, 12]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 18                  -1  1    123648  ultralytics.nn.modules.block.C2f             [192, 128, 1]                 
 19                  -1  1    147712  ultralytics.nn.modules.conv.Conv             [128, 128, 3, 2]              
 20             [-1, 9]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 21                  -1  1    493056  ultralytics.nn.modules.block.C2f             [384, 256, 1]                 
 22        [15, 18, 21]  1    431062  ultralytics.nn.modules.head.Detect           [2, [64, 128, 256]]           
Model summary: 249 layers, 2,690,598 parameters, 2,690,582 gradients, 6.9 GFLOPs

Transferred 313/391 items from pretrained weights
TensorBoard: Start with 'tensorboard --logdir yolo\yolov8\train\palmeras_imgsz640_epochs100_r0', view at http://localhost:6006/
Freezing layer 'model.22.dfl.conv.weight'
AMP: running Automatic Mixed Precision (AMP) checks with YOLO11n...
AMP: checks passed 
train: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\train.cache... 279 images, 0 backgrounds, 0 corrupt: 100%|██████████| 279/279 [00:00<?, ?it/s]
val: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\test.cache... 70 images, 0 backgrounds, 0 corrupt: 100%|██████████| 70/70 [00:00<?, ?it/s]
Plotting labels to yolo\yolov8\train\palmeras_imgsz640_epochs100_r0\labels.jpg... 
optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 63 weight(decay=0.0), 70 weight(decay=0.0005), 69 bias(decay=0.0)
TensorBoard: model graph visualization added 
Image sizes 640 train, 640 val
Using 8 dataloader workers
Logging results to yolo\yolov8\train\palmeras_imgsz640_epochs100_r0
Starting training for 100 epochs...

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      1/100      3.41G      2.093      3.675      1.648        438        640: 100%|██████████| 18/18 [00:06<00:00,  2.97it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.75it/s]
                   all         70       2670     0.0727      0.481       0.16     0.0684

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      2/100      3.85G       1.55      2.833      1.149        528        640: 100%|██████████| 18/18 [00:04<00:00,  3.89it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.78it/s]
                   all         70       2670      0.101      0.601      0.369      0.195

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      3/100      3.75G      1.569      1.959      1.134        435        640: 100%|██████████| 18/18 [00:04<00:00,  4.24it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.53it/s]
                   all         70       2670     0.0979      0.541      0.325      0.163

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      4/100      3.68G      1.506      1.442      1.143        262        640: 100%|██████████| 18/18 [00:04<00:00,  4.25it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.93it/s]
                   all         70       2670      0.114      0.593      0.345      0.171

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      5/100      3.33G      1.422      1.167      1.138        266        640: 100%|██████████| 18/18 [00:04<00:00,  4.48it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.36it/s]
                   all         70       2670      0.133      0.751      0.461      0.261

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      6/100      3.23G      1.414        1.1      1.122        204        640: 100%|██████████| 18/18 [00:03<00:00,  4.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.02it/s]
                   all         70       2670      0.629      0.382      0.515      0.291

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      7/100      3.36G      1.416       1.01      1.125        408        640: 100%|██████████| 18/18 [00:04<00:00,  4.05it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.36it/s]
                   all         70       2670       0.63      0.547      0.576        0.3

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      8/100      4.64G      1.367     0.9549      1.098        477        640: 100%|██████████| 18/18 [00:05<00:00,  3.50it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.35it/s]
                   all         70       2670      0.601      0.557      0.596        0.3

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      9/100      3.47G      1.346      0.925      1.096        449        640: 100%|██████████| 18/18 [00:03<00:00,  5.15it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.13it/s]
                   all         70       2670       0.61      0.586      0.619       0.33

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     10/100      4.63G      1.326      0.879      1.092        383        640: 100%|██████████| 18/18 [00:03<00:00,  4.70it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.19it/s]
                   all         70       2670      0.705      0.646      0.695      0.374

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     11/100      3.86G      1.324      0.853      1.096        427        640: 100%|██████████| 18/18 [00:03<00:00,  5.00it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.43it/s]
                   all         70       2670      0.638      0.626      0.669      0.357

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     12/100      3.71G      1.322     0.8324      1.088        274        640: 100%|██████████| 18/18 [00:03<00:00,  4.70it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.68it/s]
                   all         70       2670       0.75      0.641      0.693      0.378

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     13/100       3.7G      1.342     0.8314      1.093        532        640: 100%|██████████| 18/18 [00:03<00:00,  4.84it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.79it/s]
                   all         70       2670      0.668      0.698      0.715      0.391

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     14/100       3.7G      1.338     0.8033      1.084        287        640: 100%|██████████| 18/18 [00:03<00:00,  4.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.94it/s]
                   all         70       2670      0.696      0.679      0.704      0.389

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     15/100      3.08G      1.324     0.8228      1.084        386        640: 100%|██████████| 18/18 [00:03<00:00,  4.86it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.30it/s]
                   all         70       2670      0.742      0.706      0.744      0.409

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     16/100      3.97G      1.282     0.7809      1.077        537        640: 100%|██████████| 18/18 [00:03<00:00,  4.55it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.69it/s]
                   all         70       2670      0.713       0.72      0.736      0.395

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     17/100      3.71G      1.341     0.7713       1.09        366        640: 100%|██████████| 18/18 [00:04<00:00,  4.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.46it/s]
                   all         70       2670      0.645       0.58      0.638      0.344

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     18/100      3.82G      1.325     0.7754      1.088        476        640: 100%|██████████| 18/18 [00:03<00:00,  4.60it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:02<00:00,  1.25it/s]
                   all         70       2670      0.685      0.724      0.695      0.381

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     19/100      3.26G      1.316     0.7701      1.074        444        640: 100%|██████████| 18/18 [00:03<00:00,  4.97it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.39it/s]
                   all         70       2670      0.732      0.695      0.731      0.407

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     20/100      3.93G      1.274     0.7366      1.078        423        640: 100%|██████████| 18/18 [00:04<00:00,  4.49it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.73it/s]
                   all         70       2670      0.715      0.712      0.743      0.413

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     21/100      4.04G      1.283     0.7247      1.079        379        640: 100%|██████████| 18/18 [00:03<00:00,  5.11it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.31it/s]
                   all         70       2670      0.725      0.752      0.754      0.418

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     22/100      3.92G      1.268     0.7268      1.072        634        640: 100%|██████████| 18/18 [00:03<00:00,  5.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.30it/s]
                   all         70       2670      0.734       0.75       0.76      0.425

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     23/100      4.26G       1.26     0.7065      1.061        520        640: 100%|██████████| 18/18 [00:03<00:00,  5.06it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.87it/s]
                   all         70       2670      0.729      0.767      0.772      0.422

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     24/100      4.53G      1.233     0.7116      1.063        351        640: 100%|██████████| 18/18 [00:03<00:00,  4.82it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.92it/s]
                   all         70       2670      0.716      0.722      0.735      0.416

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     25/100      3.17G      1.245     0.7084       1.07        454        640: 100%|██████████| 18/18 [00:03<00:00,  5.07it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.96it/s]
                   all         70       2670      0.765      0.793      0.818      0.447

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     26/100      4.12G      1.249     0.6985       1.05        441        640: 100%|██████████| 18/18 [00:03<00:00,  4.81it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.63it/s]
                   all         70       2670      0.772      0.759      0.785      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     27/100      3.34G       1.25     0.6992      1.059        538        640: 100%|██████████| 18/18 [00:03<00:00,  4.89it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.81it/s]
                   all         70       2670      0.699      0.793      0.759      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     28/100      4.26G      1.237     0.6898      1.065        373        640: 100%|██████████| 18/18 [00:04<00:00,  3.91it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.77it/s]
                   all         70       2670       0.72      0.769      0.768      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     29/100      3.47G      1.236     0.6788      1.052        387        640: 100%|██████████| 18/18 [00:03<00:00,  4.74it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:02<00:00,  1.23it/s]
                   all         70       2670      0.766      0.732      0.776      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     30/100      3.92G       1.25     0.6893      1.055        568        640: 100%|██████████| 18/18 [00:03<00:00,  4.57it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.07it/s]
                   all         70       2670      0.777      0.756      0.783      0.431

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     31/100      3.75G      1.234     0.6713      1.052        272        640: 100%|██████████| 18/18 [00:03<00:00,  4.67it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.07it/s]
                   all         70       2670      0.768      0.787      0.798      0.433

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     32/100      3.89G      1.228     0.6708      1.065        362        640: 100%|██████████| 18/18 [00:03<00:00,  4.84it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.44it/s]
                   all         70       2670      0.747      0.743      0.747       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     33/100      3.74G      1.211     0.6659      1.053        374        640: 100%|██████████| 18/18 [00:03<00:00,  4.66it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.35it/s]
                   all         70       2670      0.825      0.722      0.801      0.452

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     34/100      4.25G      1.257     0.6712      1.055        434        640: 100%|██████████| 18/18 [00:04<00:00,  4.42it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.88it/s]
                   all         70       2670      0.765      0.756      0.782      0.432

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     35/100      4.18G      1.241     0.6659      1.043        524        640: 100%|██████████| 18/18 [00:03<00:00,  4.79it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.58it/s]
                   all         70       2670      0.746      0.779      0.783      0.437

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     36/100      3.84G      1.235     0.6513      1.052        306        640: 100%|██████████| 18/18 [00:04<00:00,  4.25it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.01it/s]
                   all         70       2670      0.742      0.767      0.779      0.436

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     37/100      3.32G      1.236     0.6644      1.059        322        640: 100%|██████████| 18/18 [00:03<00:00,  4.65it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.11it/s]
                   all         70       2670      0.798      0.755      0.795      0.438

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     38/100       3.9G       1.24     0.6505      1.055        347        640: 100%|██████████| 18/18 [00:03<00:00,  4.92it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.10it/s]
                   all         70       2670      0.769      0.765      0.786      0.418

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     39/100      3.96G      1.227     0.6525      1.054        438        640: 100%|██████████| 18/18 [00:03<00:00,  4.87it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.03it/s]
                   all         70       2670       0.78       0.79      0.804      0.451

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     40/100      4.16G      1.211     0.6353      1.052        433        640: 100%|██████████| 18/18 [00:03<00:00,  4.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.01it/s]
                   all         70       2670      0.798      0.768      0.806      0.453

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     41/100      3.92G      1.204     0.6337      1.043        384        640: 100%|██████████| 18/18 [00:03<00:00,  4.68it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.67it/s]
                   all         70       2670      0.764      0.782      0.807      0.448

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     42/100      3.47G      1.229     0.6529      1.055        343        640: 100%|██████████| 18/18 [00:03<00:00,  4.74it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.90it/s]
                   all         70       2670      0.773      0.769      0.791      0.435

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     43/100      3.89G       1.22     0.6427      1.055        419        640: 100%|██████████| 18/18 [00:04<00:00,  4.29it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.82it/s]
                   all         70       2670      0.791       0.76      0.799      0.446

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     44/100      3.61G      1.215      0.632      1.041        620        640: 100%|██████████| 18/18 [00:03<00:00,  5.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.51it/s]
                   all         70       2670       0.75      0.771      0.776      0.441

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     45/100      3.41G      1.174     0.6115      1.034        497        640: 100%|██████████| 18/18 [00:03<00:00,  4.70it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.18it/s]
                   all         70       2670      0.776      0.774      0.793      0.444

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     46/100      3.64G      1.211     0.6287      1.045        539        640: 100%|██████████| 18/18 [00:04<00:00,  4.44it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.13it/s]
                   all         70       2670      0.773      0.792      0.802      0.451

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     47/100      3.07G      1.197     0.6211      1.046        293        640: 100%|██████████| 18/18 [00:04<00:00,  4.43it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.08it/s]
                   all         70       2670      0.774      0.798      0.799      0.447

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     48/100      4.37G      1.183     0.6181      1.042        352        640: 100%|██████████| 18/18 [00:08<00:00,  2.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.46it/s]
                   all         70       2670      0.764      0.771      0.784      0.443

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     49/100      3.45G      1.179     0.6119      1.041        541        640: 100%|██████████| 18/18 [00:03<00:00,  5.33it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.79it/s]
                   all         70       2670       0.79      0.774      0.811      0.449

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     50/100      4.07G      1.214     0.6321      1.047        407        640: 100%|██████████| 18/18 [00:03<00:00,  5.15it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.75it/s]
                   all         70       2670      0.783      0.777      0.797       0.44

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     51/100      3.39G      1.191     0.6088      1.046        496        640: 100%|██████████| 18/18 [00:03<00:00,  4.62it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.93it/s]
                   all         70       2670      0.767      0.784      0.789      0.442

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     52/100       3.9G      1.191     0.6043      1.035        465        640: 100%|██████████| 18/18 [00:03<00:00,  4.66it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.13it/s]
                   all         70       2670      0.765       0.78      0.791      0.441

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     53/100      3.95G      1.194     0.6015      1.041        553        640: 100%|██████████| 18/18 [00:03<00:00,  4.82it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.00it/s]
                   all         70       2670      0.763      0.765      0.777      0.435

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     54/100      4.12G       1.16     0.6032      1.036        449        640: 100%|██████████| 18/18 [00:03<00:00,  4.87it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.26it/s]
                   all         70       2670      0.774      0.765      0.791      0.443

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     55/100      3.61G      1.175     0.5983      1.038        262        640: 100%|██████████| 18/18 [00:03<00:00,  4.87it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.36it/s]
                   all         70       2670      0.786      0.789      0.806       0.45

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     56/100      3.68G      1.163     0.5845      1.025        475        640: 100%|██████████| 18/18 [00:04<00:00,  3.99it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.23it/s]
                   all         70       2670      0.797      0.796      0.813      0.445

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     57/100       4.1G      1.167     0.6046      1.044        252        640: 100%|██████████| 18/18 [00:03<00:00,  5.36it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.73it/s]
                   all         70       2670      0.796      0.785      0.814      0.447

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     58/100      4.23G      1.167     0.5943      1.031        550        640: 100%|██████████| 18/18 [00:03<00:00,  5.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.83it/s]
                   all         70       2670      0.801      0.757      0.801      0.442

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     59/100      4.71G       1.17     0.6012      1.033        315        640: 100%|██████████| 18/18 [00:07<00:00,  2.28it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.35it/s]
                   all         70       2670      0.748      0.802      0.786       0.44

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     60/100      3.49G      1.162     0.5909      1.036        474        640: 100%|██████████| 18/18 [00:03<00:00,  5.38it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.89it/s]
                   all         70       2670      0.762      0.791      0.782      0.449
EarlyStopping: Training stopped early as no improvement observed in last 20 epochs. Best results observed at epoch 40, best model saved as best.pt.
To update EarlyStopping(patience=20) pass a new patience value, i.e. `patience=300` or use `patience=0` to disable EarlyStopping.

60 epochs completed in 0.109 hours.
Optimizer stripped from yolo\yolov8\train\palmeras_imgsz640_epochs100_r0\weights\last.pt, 5.6MB
Optimizer stripped from yolo\yolov8\train\palmeras_imgsz640_epochs100_r0\weights\best.pt, 5.6MB

Validating yolo\yolov8\train\palmeras_imgsz640_epochs100_r0\weights\best.pt...
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
Model summary (fused): 186 layers, 2,684,758 parameters, 0 gradients, 6.8 GFLOPs
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:03<00:00,  1.12s/it]
                   all         70       2670      0.798      0.768      0.806      0.453
                  Tree         43        325      0.668        0.6      0.644      0.303
                  Palm         66       2345      0.928      0.936      0.968      0.604
Speed: 1.0ms preprocess, 3.5ms inference, 0.0ms loss, 7.0ms postprocess per image
Results saved to yolo\yolov8\train\palmeras_imgsz640_epochs100_r0
Out[ ]:
ultralytics.utils.metrics.DetMetrics object with attributes:

ap_class_index: array([0, 1])
box: ultralytics.utils.metrics.Metric object
confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x000002258AD1CAD0>
curves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)']
curves_results: [[array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[          1,           1,           1, ...,    0.001345,  0.00067252,           0],
       [          1,           1,           1, ...,    0.028517,    0.014258,           0]]), 'Recall', 'Precision'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
           [... curve arrays truncated for readability: three (x, y) pairs, each sampled at 1000 confidence thresholds over [0, 1], labeled ('Confidence', 'F1'), ('Confidence', 'Precision') and ('Confidence', 'Recall') ...]
fitness: 0.4886148220553541
keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']
maps: array([    0.30274,     0.60396])
names: {0: 'Tree', 1: 'Palm'}
plot: True
results_dict: {'metrics/precision(B)': 0.7980907820130512, 'metrics/recall(B)': 0.7680170575692964, 'metrics/mAP50(B)': 0.8059971252024734, 'metrics/mAP50-95(B)': 0.4533501217056742, 'fitness': 0.4886148220553541}
save_dir: WindowsPath('yolo/yolov8/train/palmeras_imgsz640_epochs100_r0')
speed: {'preprocess': 0.9570871080671038, 'inference': 3.5302979605538503, 'loss': 0.0, 'postprocess': 7.015698296683175}
task: 'detect'
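The `fitness` value reported above can be cross-checked against the printed `results_dict`: assuming the standard Ultralytics detection weighting (0.1 · mAP50 + 0.9 · mAP50-95), the numbers reproduce it exactly. A minimal sketch using the values from the output:

```python
# Values copied from the validation output above.
results_dict = {
    'metrics/precision(B)': 0.7980907820130512,
    'metrics/recall(B)': 0.7680170575692964,
    'metrics/mAP50(B)': 0.8059971252024734,
    'metrics/mAP50-95(B)': 0.4533501217056742,
}

# Ultralytics detection fitness: weighted sum of mAP50 and mAP50-95
# (precision and recall carry zero weight).
fitness = (0.1 * results_dict['metrics/mAP50(B)']
           + 0.9 * results_dict['metrics/mAP50-95(B)'])
print(round(fitness, 6))  # -> 0.488615, matching the reported fitness
```

Note also that `maps` holds the per-class mAP50-95 (Tree: 0.30274, Palm: 0.60396), whose mean recovers the overall `metrics/mAP50-95(B)`.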
In [ ]:
# Train the base model
model_yolov8_p2.train(project=RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN, **params_2)
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
engine\trainer: task=detect, mode=train, model=yolov8n.pt, data=./resources\utils\yolov8.yaml, epochs=200, time=None, patience=20, batch=16, imgsz=480, save=True, save_period=-1, cache=False, device=cuda, workers=8, project=yolo\yolov8\train, name=palmeras_imgsz480_epochs200_r0, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=4, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, label_smoothing=0.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=yolo\yolov8\train\palmeras_imgsz480_epochs200_r0
Overriding model.yaml nc=80 with nc=2

                   from  n    params  module                                       arguments                     
  0                  -1  1       464  ultralytics.nn.modules.conv.Conv             [3, 16, 3, 2]                 
  1                  -1  1      4672  ultralytics.nn.modules.conv.Conv             [16, 32, 3, 2]                
  2                  -1  1      7360  ultralytics.nn.modules.block.C2f             [32, 32, 1, True]             
  3                  -1  1     18560  ultralytics.nn.modules.conv.Conv             [32, 64, 3, 2]                
  4                  -1  2     49664  ultralytics.nn.modules.block.C2f             [64, 64, 2, True]             
  5                  -1  1     73984  ultralytics.nn.modules.conv.Conv             [64, 128, 3, 2]               
  6                  -1  2    197632  ultralytics.nn.modules.block.C2f             [128, 128, 2, True]           
  7                  -1  1    295424  ultralytics.nn.modules.conv.Conv             [128, 256, 3, 2]              
  8                  -1  1    460288  ultralytics.nn.modules.block.C2f             [256, 256, 1, True]           
  9                  -1  1    164608  ultralytics.nn.modules.block.SPPF            [256, 256, 5]                 
 10                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 11             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 12                  -1  1    148224  ultralytics.nn.modules.block.C2f             [384, 128, 1]                 
 13                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 14             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 15                  -1  1     37248  ultralytics.nn.modules.block.C2f             [192, 64, 1]                  
 16                  -1  1     36992  ultralytics.nn.modules.conv.Conv             [64, 64, 3, 2]                
 17            [-1, 12]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 18                  -1  1    123648  ultralytics.nn.modules.block.C2f             [192, 128, 1]                 
 19                  -1  1    147712  ultralytics.nn.modules.conv.Conv             [128, 128, 3, 2]              
 20             [-1, 9]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 21                  -1  1    493056  ultralytics.nn.modules.block.C2f             [384, 256, 1]                 
 22        [15, 18, 21]  1    431062  ultralytics.nn.modules.head.Detect           [2, [64, 128, 256]]           
Model summary: 249 layers, 2,690,598 parameters, 2,690,582 gradients, 6.9 GFLOPs

Transferred 313/391 items from pretrained weights
TensorBoard: Start with 'tensorboard --logdir yolo\yolov8\train\palmeras_imgsz480_epochs200_r0', view at http://localhost:6006/
Freezing layer 'model.22.dfl.conv.weight'
AMP: running Automatic Mixed Precision (AMP) checks with YOLO11n...
AMP: checks passed 
train: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\train.cache... 279 images, 0 backgrounds, 0 corrupt: 100%|██████████| 279/279 [00:00<?, ?it/s]
val: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\test.cache... 70 images, 0 backgrounds, 0 corrupt: 100%|██████████| 70/70 [00:00<?, ?it/s]
Plotting labels to yolo\yolov8\train\palmeras_imgsz480_epochs200_r0\labels.jpg... 
optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 63 weight(decay=0.0), 70 weight(decay=0.0005), 69 bias(decay=0.0)
TensorBoard: model graph visualization added 
Image sizes 480 train, 480 val
Using 8 dataloader workers
Logging results to yolo\yolov8\train\palmeras_imgsz480_epochs200_r0
Starting training for 200 epochs...

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      1/200      1.99G      2.362      3.779      1.665        439        480: 100%|██████████| 18/18 [00:04<00:00,  4.17it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.64it/s]
                   all         70       2670     0.0534      0.338     0.0429     0.0199

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      2/200      2.23G      1.642      3.137      1.153        527        480: 100%|██████████| 18/18 [00:04<00:00,  3.73it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.75it/s]
                   all         70       2670     0.0898      0.536      0.272      0.117

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      3/200      2.18G      1.644      2.239      1.106        435        480: 100%|██████████| 18/18 [00:04<00:00,  4.24it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.15it/s]
                   all         70       2670      0.096      0.464      0.205     0.0755

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      4/200      2.14G      1.589      1.612      1.096        262        480: 100%|██████████| 18/18 [00:03<00:00,  4.65it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.04it/s]
                   all         70       2670      0.121       0.51      0.261      0.128

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      5/200      1.94G      1.514      1.269      1.086        265        480: 100%|██████████| 18/18 [00:03<00:00,  4.68it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.91it/s]
                   all         70       2670      0.789      0.413      0.408      0.219

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      6/200      1.88G      1.489      1.196      1.071        204        480: 100%|██████████| 18/18 [00:03<00:00,  4.96it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.51it/s]
                   all         70       2670       0.91      0.368      0.434      0.234

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      7/200      1.96G      1.449      1.059      1.063        408        480: 100%|██████████| 18/18 [00:04<00:00,  4.25it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.52it/s]
                   all         70       2670      0.421      0.407      0.446      0.246

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      8/200      2.68G       1.42     0.9878      1.049        477        480: 100%|██████████| 18/18 [00:04<00:00,  4.35it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.37it/s]
                   all         70       2670      0.399      0.492      0.483      0.265

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      9/200      2.02G      1.407     0.9554      1.062        449        480: 100%|██████████| 18/18 [00:03<00:00,  5.61it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.19it/s]
                   all         70       2670      0.603        0.5       0.53        0.3

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     10/200      2.67G      1.371     0.9196      1.045        383        480: 100%|██████████| 18/18 [00:02<00:00,  6.75it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.67it/s]
                   all         70       2670       0.57       0.54      0.555      0.319

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     11/200      2.24G      1.363     0.8758      1.048        426        480: 100%|██████████| 18/18 [00:04<00:00,  4.46it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.30it/s]
                   all         70       2670      0.686      0.568       0.61      0.327

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     12/200      2.16G      1.384     0.8511       1.05        274        480: 100%|██████████| 18/18 [00:03<00:00,  5.35it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.91it/s]
                   all         70       2670      0.591      0.635      0.636      0.337

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     13/200      2.15G      1.409      0.847      1.054        531        480: 100%|██████████| 18/18 [00:02<00:00,  7.01it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.45it/s]
                   all         70       2670      0.601      0.646      0.643      0.346

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     14/200      2.15G       1.38     0.8169      1.039        287        480: 100%|██████████| 18/18 [00:03<00:00,  5.07it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.35it/s]
                   all         70       2670      0.649      0.606       0.64      0.357

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     15/200      1.86G      1.381     0.8255      1.047        382        480: 100%|██████████| 18/18 [00:03<00:00,  5.80it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.10it/s]
                   all         70       2670      0.703      0.635      0.669       0.36

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     16/200       2.3G      1.324     0.7819      1.041        540        480: 100%|██████████| 18/18 [00:03<00:00,  5.13it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.29it/s]
                   all         70       2670      0.664      0.677      0.686      0.362

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     17/200      2.16G      1.337     0.7503       1.03        366        480: 100%|██████████| 18/18 [00:02<00:00,  6.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.80it/s]
                   all         70       2670      0.675      0.649      0.689      0.388

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     18/200      2.23G      1.328     0.7551      1.036        476        480: 100%|██████████| 18/18 [00:03<00:00,  5.43it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.09it/s]
                   all         70       2670       0.71      0.695      0.714      0.389

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     19/200      1.91G      1.342     0.7688      1.028        444        480: 100%|██████████| 18/18 [00:03<00:00,  5.49it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.37it/s]
                   all         70       2670      0.672      0.621      0.669       0.37

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     20/200      2.28G      1.304     0.7295      1.032        423        480: 100%|██████████| 18/18 [00:03<00:00,  4.68it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.41it/s]
                   all         70       2670      0.671      0.636      0.654      0.364

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     21/200      2.35G      1.323     0.7223      1.037        379        480: 100%|██████████| 18/18 [00:02<00:00,  7.29it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.30it/s]
                   all         70       2670      0.748      0.695      0.729      0.395

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     22/200      2.27G      1.327     0.7268      1.033        632        480: 100%|██████████| 18/18 [00:03<00:00,  4.90it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.33it/s]
                   all         70       2670      0.738      0.715      0.731      0.397

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     23/200      2.47G      1.314     0.7157      1.023        520        480: 100%|██████████| 18/18 [00:02<00:00,  7.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.30it/s]
                   all         70       2670       0.79      0.701      0.757      0.408

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     24/200      2.62G      1.268     0.7115      1.029        351        480: 100%|██████████| 18/18 [00:03<00:00,  5.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.42it/s]
                   all         70       2670      0.746       0.69      0.717      0.389

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     25/200      1.86G      1.285     0.7129      1.033        455        480: 100%|██████████| 18/18 [00:03<00:00,  5.45it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.12it/s]
                   all         70       2670      0.779      0.709      0.749      0.404

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     26/200      2.39G      1.298     0.7082      1.019        440        480: 100%|██████████| 18/18 [00:02<00:00,  6.33it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.99it/s]
                   all         70       2670      0.736      0.651      0.699      0.366

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     27/200      1.96G      1.295     0.7095      1.023        538        480: 100%|██████████| 18/18 [00:03<00:00,  5.28it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.24it/s]
                   all         70       2670       0.74      0.696      0.726      0.391

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     28/200      2.46G      1.286      0.699      1.033        372        480: 100%|██████████| 18/18 [00:03<00:00,  5.15it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.66it/s]
                   all         70       2670      0.701      0.652      0.678      0.373

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     29/200      2.02G      1.293     0.6926      1.021        387        480: 100%|██████████| 18/18 [00:03<00:00,  5.73it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.64it/s]
                   all         70       2670      0.744       0.69      0.714      0.373

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     30/200      2.28G      1.288     0.6927      1.017        568        480: 100%|██████████| 18/18 [00:02<00:00,  6.26it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.68it/s]
                   all         70       2670      0.775      0.709      0.756       0.41

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     31/200      2.18G      1.271       0.67      1.016        272        480: 100%|██████████| 18/18 [00:03<00:00,  4.79it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.55it/s]
                   all         70       2670      0.767      0.706      0.748      0.408

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     32/200      2.25G      1.277     0.6763      1.032        363        480: 100%|██████████| 18/18 [00:03<00:00,  5.26it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.88it/s]
                   all         70       2670      0.768      0.709      0.739      0.399

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     33/200      2.17G      1.254     0.6679       1.02        375        480: 100%|██████████| 18/18 [00:03<00:00,  5.05it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.61it/s]
                   all         70       2670      0.763      0.732       0.77      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     34/200      2.46G      1.292     0.6827      1.016        434        480: 100%|██████████| 18/18 [00:02<00:00,  6.26it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.29it/s]
                   all         70       2670      0.765      0.725      0.757      0.407

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     35/200      2.42G      1.271     0.6714      1.009        523        480: 100%|██████████| 18/18 [00:04<00:00,  4.40it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.53it/s]
                   all         70       2670      0.694      0.684      0.704      0.393

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     36/200      2.23G      1.278       0.66      1.016        306        480: 100%|██████████| 18/18 [00:04<00:00,  4.24it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.45it/s]
                   all         70       2670       0.71      0.729       0.73      0.404

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     37/200      1.92G      1.277     0.6603      1.023        321        480: 100%|██████████| 18/18 [00:02<00:00,  6.56it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.41it/s]
                   all         70       2670      0.721      0.708      0.728      0.404

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     38/200      2.27G       1.28     0.6568       1.02        347        480: 100%|██████████| 18/18 [00:03<00:00,  5.96it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.51it/s]
                   all         70       2670      0.713      0.732      0.744      0.395

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     39/200       2.3G      1.272     0.6555      1.022        439        480: 100%|██████████| 18/18 [00:03<00:00,  5.20it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.51it/s]
                   all         70       2670      0.758      0.722      0.758      0.415

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     40/200      2.41G      1.251     0.6471      1.019        433        480: 100%|██████████| 18/18 [00:02<00:00,  6.46it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.41it/s]
                   all         70       2670      0.795      0.718       0.76      0.413

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     41/200      2.28G      1.249     0.6418      1.015        384        480: 100%|██████████| 18/18 [00:03<00:00,  5.07it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.49it/s]
                   all         70       2670      0.761      0.723      0.757      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     42/200      2.24G      1.268     0.6577      1.022        342        480: 100%|██████████| 18/18 [00:04<00:00,  4.37it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.26it/s]
                   all         70       2670      0.739      0.728      0.753      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     43/200      2.26G      1.251     0.6487      1.019        419        480: 100%|██████████| 18/18 [00:03<00:00,  5.80it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.25it/s]
                   all         70       2670      0.702       0.72       0.73       0.41

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     44/200       2.1G      1.255     0.6384      1.003        619        480: 100%|██████████| 18/18 [00:03<00:00,  5.57it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.03it/s]
                   all         70       2670       0.76      0.699      0.746      0.408

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     45/200      1.99G      1.228     0.6281          1        497        480: 100%|██████████| 18/18 [00:03<00:00,  4.69it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.03it/s]
                   all         70       2670      0.728      0.748       0.75      0.415

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     46/200      2.12G      1.246     0.6377      1.008        537        480: 100%|██████████| 18/18 [00:02<00:00,  7.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.86it/s]
                   all         70       2670      0.757      0.709      0.743      0.408

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     47/200       1.8G       1.25     0.6334      1.014        293        480: 100%|██████████| 18/18 [00:03<00:00,  5.35it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.13it/s]
                   all         70       2670      0.718      0.742      0.747      0.407

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     48/200      2.54G       1.23     0.6318       1.01        353        480: 100%|██████████| 18/18 [00:04<00:00,  3.67it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.59it/s]
                   all         70       2670      0.734      0.736      0.754      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     49/200      2.01G      1.227     0.6209      1.011        540        480: 100%|██████████| 18/18 [00:02<00:00,  6.05it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.46it/s]
                   all         70       2670      0.725      0.735       0.76       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     50/200      2.36G       1.26     0.6413      1.013        407        480: 100%|██████████| 18/18 [00:03<00:00,  4.75it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.75it/s]
                   all         70       2670      0.797      0.737      0.768      0.423

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     51/200      1.98G      1.243     0.6262      1.017        495        480: 100%|██████████| 18/18 [00:02<00:00,  7.07it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.74it/s]
                   all         70       2670      0.766      0.699      0.724      0.405

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     52/200      2.26G      1.246     0.6194      1.008        466        480: 100%|██████████| 18/18 [00:03<00:00,  4.81it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.06it/s]
                   all         70       2670       0.75      0.751      0.761      0.416

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     53/200       2.3G      1.244     0.6157      1.012        553        480: 100%|██████████| 18/18 [00:02<00:00,  6.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.27it/s]
                   all         70       2670      0.719      0.744      0.749       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     54/200      2.39G      1.213     0.6196       1.01        449        480: 100%|██████████| 18/18 [00:02<00:00,  6.41it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.57it/s]
                   all         70       2670      0.754      0.741      0.766      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     55/200      2.11G      1.221     0.6142      1.007        261        480: 100%|██████████| 18/18 [00:03<00:00,  5.40it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.71it/s]
                   all         70       2670      0.767       0.76      0.779      0.423

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     56/200      2.14G      1.211     0.6022     0.9943        475        480: 100%|██████████| 18/18 [00:03<00:00,  5.06it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.99it/s]
                   all         70       2670      0.802      0.739      0.778      0.426

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     57/200      2.38G      1.227      0.629      1.019        252        480: 100%|██████████| 18/18 [00:03<00:00,  4.59it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.13it/s]
                   all         70       2670      0.774      0.747      0.781      0.412

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     58/200      2.45G      1.229     0.6188      1.006        549        480: 100%|██████████| 18/18 [00:02<00:00,  6.41it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.27it/s]
                   all         70       2670      0.779      0.739      0.761      0.409

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     59/200      2.79G      1.234     0.6251      1.011        315        480: 100%|██████████| 18/18 [00:04<00:00,  4.47it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.50it/s]
                   all         70       2670      0.745      0.735       0.76      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     60/200      2.04G      1.224     0.6097      1.011        475        480: 100%|██████████| 18/18 [00:03<00:00,  5.40it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.79it/s]
                   all         70       2670      0.749      0.754      0.775      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     61/200      2.28G      1.234     0.6065     0.9964        547        480: 100%|██████████| 18/18 [00:03<00:00,  5.65it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.65it/s]
                   all         70       2670      0.762       0.76      0.779      0.414

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     62/200      2.37G      1.231     0.6064      1.012        360        480: 100%|██████████| 18/18 [00:02<00:00,  6.91it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.17it/s]
                   all         70       2670      0.775       0.75      0.761      0.408

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     63/200      2.33G      1.209     0.5939     0.9981        473        480: 100%|██████████| 18/18 [00:03<00:00,  4.75it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.68it/s]
                   all         70       2670      0.786      0.743      0.781      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     64/200      2.69G      1.225     0.5912      0.999        395        480: 100%|██████████| 18/18 [00:04<00:00,  4.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.08it/s]
                   all         70       2670      0.795       0.75      0.775      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     65/200         2G      1.202     0.5903     0.9968        511        480: 100%|██████████| 18/18 [00:03<00:00,  4.61it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.47it/s]
                   all         70       2670      0.753      0.746      0.765      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     66/200      2.05G       1.22     0.6013      1.006        411        480: 100%|██████████| 18/18 [00:03<00:00,  5.89it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.40it/s]
                   all         70       2670      0.761      0.776      0.792      0.449

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     67/200      2.21G      1.177     0.5806      1.004        449        480: 100%|██████████| 18/18 [00:02<00:00,  6.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.98it/s]
                   all         70       2670      0.749       0.72       0.75      0.423

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     68/200       2.7G      1.245       0.61      1.013        392        480: 100%|██████████| 18/18 [00:04<00:00,  4.13it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.37it/s]
                   all         70       2670      0.799      0.739      0.779      0.424

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     69/200      2.17G      1.217      0.596      1.006        425        480: 100%|██████████| 18/18 [00:02<00:00,  6.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.65it/s]
                   all         70       2670      0.792      0.752      0.783      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     70/200      2.17G      1.195     0.5828      1.005        306        480: 100%|██████████| 18/18 [00:03<00:00,  5.25it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.92it/s]
                   all         70       2670      0.776      0.729      0.763      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     71/200      1.99G      1.179     0.5714     0.9992        504        480: 100%|██████████| 18/18 [00:03<00:00,  5.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.33it/s]
                   all         70       2670      0.782      0.747      0.762      0.425

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     72/200      2.36G      1.234     0.5986      1.019        321        480: 100%|██████████| 18/18 [00:04<00:00,  3.86it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.62it/s]
                   all         70       2670      0.762      0.766       0.78      0.424

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     73/200      2.34G      1.205     0.5815     0.9973        497        480: 100%|██████████| 18/18 [00:02<00:00,  7.04it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.68it/s]
                   all         70       2670      0.814       0.75      0.785      0.425

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     74/200      2.28G      1.167      0.568      0.986        548        480: 100%|██████████| 18/18 [00:03<00:00,  5.63it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.75it/s]
                   all         70       2670      0.774      0.748       0.78      0.433

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     75/200      2.04G      1.182     0.5755     0.9858        667        480: 100%|██████████| 18/18 [00:03<00:00,  5.93it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.46it/s]
                   all         70       2670      0.767      0.747      0.772      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     76/200       2.2G      1.203     0.5771     0.9962        537        480: 100%|██████████| 18/18 [00:04<00:00,  4.46it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.38it/s]
                   all         70       2670      0.763      0.716      0.758      0.422

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     77/200      1.89G      1.186     0.5712     0.9981        468        480: 100%|██████████| 18/18 [00:03<00:00,  4.89it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.24it/s]
                   all         70       2670      0.806      0.741      0.779      0.433

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     78/200      2.47G        1.2     0.5785     0.9911        520        480: 100%|██████████| 18/18 [00:03<00:00,  5.01it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.52it/s]
                   all         70       2670      0.787       0.74      0.789      0.435

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     79/200       2.2G      1.177     0.5775     0.9944        649        480: 100%|██████████| 18/18 [00:03<00:00,  5.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.15it/s]
                   all         70       2670      0.796      0.753       0.79      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     80/200      1.83G      1.204     0.5809      1.011        302        480: 100%|██████████| 18/18 [00:02<00:00,  6.13it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.97it/s]
                   all         70       2670      0.745      0.776      0.765      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     81/200       2.1G      1.177     0.5659     0.9925        372        480: 100%|██████████| 18/18 [00:03<00:00,  5.36it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.61it/s]
                   all         70       2670      0.777      0.726      0.764       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     82/200      2.63G      1.183      0.569      1.005        436        480: 100%|██████████| 18/18 [00:02<00:00,  6.25it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.18it/s]
                   all         70       2670      0.782      0.737      0.775      0.416

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     83/200      2.07G      1.185     0.5792     0.9933        445        480: 100%|██████████| 18/18 [00:02<00:00,  6.50it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.27it/s]
                   all         70       2670      0.773      0.711      0.745       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     84/200      1.77G      1.179     0.5753     0.9952        717        480: 100%|██████████| 18/18 [00:05<00:00,  3.37it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.71it/s]
                   all         70       2670      0.751      0.729      0.739      0.412

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     85/200      1.97G      1.189     0.5715     0.9913        493        480: 100%|██████████| 18/18 [00:03<00:00,  4.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.99it/s]
                   all         70       2670       0.79      0.759      0.783      0.429

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     86/200      2.33G       1.18     0.5739     0.9924        366        480: 100%|██████████| 18/18 [00:02<00:00,  6.66it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.11it/s]
                   all         70       2670      0.783      0.756      0.777      0.424

EarlyStopping: Training stopped early as no improvement observed in last 20 epochs. Best results observed at epoch 66, best model saved as best.pt.
To update EarlyStopping(patience=20) pass a new patience value, i.e. `patience=300` or use `patience=0` to disable EarlyStopping.
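The early-stopping behaviour reported above (best fitness at epoch 66, `patience=20`, training halted at epoch 86) can be sketched as a simple patience counter. This is an illustrative reimplementation of the idea, not Ultralytics' actual `EarlyStopping` code:

```python
def early_stop_epoch(fitness_per_epoch, patience=20):
    """Return the 1-based epoch at which training halts: the first epoch
    that lies `patience` epochs past the best fitness seen so far."""
    best_epoch, best_fitness = 0, float("-inf")
    for epoch, fitness in enumerate(fitness_per_epoch, start=1):
        if fitness > best_fitness:
            best_epoch, best_fitness = epoch, fitness
        if epoch - best_epoch >= patience:
            return epoch  # no improvement in the last `patience` epochs
    return len(fitness_per_epoch)  # ran to completion without triggering

# Mirrors the run above: fitness improves up to epoch 66, then plateaus,
# so with patience=20 the stop fires at epoch 86.
fitness = [i / 66 for i in range(1, 67)] + [0.0] * 20
print(early_stop_epoch(fitness, patience=20))  # → 86
```

As the log notes, passing `patience=0` to `model.train()` disables this check entirely, and a larger value (e.g. `patience=300`) lets noisy validation metrics recover before stopping.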

86 epochs completed in 0.134 hours.
Optimizer stripped from yolo\yolov8\train\palmeras_imgsz480_epochs200_r0\weights\last.pt, 5.6MB
Optimizer stripped from yolo\yolov8\train\palmeras_imgsz480_epochs200_r0\weights\best.pt, 5.6MB

Validating yolo\yolov8\train\palmeras_imgsz480_epochs200_r0\weights\best.pt...
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
Model summary (fused): 186 layers, 2,684,758 parameters, 0 gradients, 6.8 GFLOPs
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:03<00:00,  1.06s/it]
                   all         70       2670       0.76      0.776      0.791      0.449
                  Tree         43        325      0.592      0.631      0.621      0.291
                  Palm         66       2345      0.929      0.922      0.962      0.608
Speed: 0.4ms preprocess, 2.2ms inference, 0.0ms loss, 5.8ms postprocess per image
Results saved to yolo\yolov8\train\palmeras_imgsz480_epochs200_r0
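As a quick sanity check on the validation table above: assuming the "all" row is the unweighted mean of the per-class values (which the numbers here are consistent with), the strong Palm class (mAP50 0.962 over 2345 instances) masks the much weaker Tree class (0.621 over only 325 instances):

```python
# Per-class values copied from the validation table above.
map50 = {"Tree": 0.621, "Palm": 0.962}
map50_95 = {"Tree": 0.291, "Palm": 0.608}

# Unweighted mean over classes reproduces the "all" row to rounding.
map50_all = sum(map50.values()) / len(map50)            # 0.7915 ≈ 0.791
map50_95_all = sum(map50_95.values()) / len(map50_95)   # 0.4495 ≈ 0.449
print(map50_all, map50_95_all)
```

Because each class contributes equally regardless of instance count, the aggregate hides the Tree/Palm performance gap; the per-class rows are the more informative numbers here.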
Out[ ]:
ultralytics.utils.metrics.DetMetrics object with attributes:

ap_class_index: array([0, 1])
box: ultralytics.utils.metrics.Metric object
confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x000001DD1640BC10>
curves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)']
curves_results: [[array([0, 0.001001, 0.002002, ..., 0.998, 0.999, 1]), array([[1, 1, 1, ...,  0.00057085,  0.00028542, 0],
        [1, 1, 1, ..., 0.018603, 0.0093015, 0]]), 'Recall', 'Precision'], ...]
(full 1000-point confidence grids omitted; the same x-axis array is repeated for the F1-Confidence, Precision-Confidence and Recall-Confidence curves)
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[   0.051158,    0.051158,    0.055628, ...,           0,           0,           0],
       [    0.37713,     0.37713,     0.41357, ...,           0,           0,           0]]), 'Confidence', 'F1'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[    0.02632,     0.02632,    0.028693, ...,           1,           1,           1],
       [    0.23379,     0.23379,     0.26249, ...,           1,           1,           1]]), 'Confidence', 'Precision'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           ... (evenly spaced confidence-axis values from 0.0 to 1.0 omitted) ...,           1]), array([[    0.90769,     0.90769,     0.90769, ...,           0,           0,           0],
       [    0.97484,     0.97484,     0.97441, ...,           0,           0,           0]]), 'Confidence', 'Recall']]
fitness: 0.4833088152300575
keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']
maps: array([    0.29059,     0.60755])
names: {0: 'Tree', 1: 'Palm'}
plot: True
results_dict: {'metrics/precision(B)': 0.7602786575389766, 'metrics/recall(B)': 0.7761522060029522, 'metrics/mAP50(B)': 0.7914198185930055, 'metrics/mAP50-95(B)': 0.44907425930084105, 'fitness': 0.4833088152300575}
save_dir: WindowsPath('yolo/yolov8/train/palmeras_imgsz480_epochs200_r0')
speed: {'preprocess': 0.38558074406215126, 'inference': 2.200007438659668, 'loss': 0.0, 'postprocess': 5.842811720711844}
task: 'detect'
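The `fitness` value in the summary above is not an independent metric: for detection tasks, Ultralytics computes it as a weighted sum of the four reported metrics, with weights `[0.0, 0.0, 0.1, 0.9]` for precision, recall, mAP50 and mAP50-95 respectively. A quick sanity check against the printed `results_dict` (a minimal sketch; the dictionary values are copied from the output above):

```python
# Metric values copied from the results_dict printed above.
results_dict = {
    'metrics/precision(B)': 0.7602786575389766,
    'metrics/recall(B)': 0.7761522060029522,
    'metrics/mAP50(B)': 0.7914198185930055,
    'metrics/mAP50-95(B)': 0.44907425930084105,
}

# Ultralytics detection fitness: 0.1 * mAP50 + 0.9 * mAP50-95
# (precision and recall carry zero weight).
fitness = (0.1 * results_dict['metrics/mAP50(B)']
           + 0.9 * results_dict['metrics/mAP50-95(B)'])
print(fitness)  # ≈ 0.4833088, matching the reported fitness value
```

This is the scalar that Ultralytics uses internally to pick the `best.pt` checkpoint during training, which is why mAP50-95 dominates model selection.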
In [ ]:
# Train the base YOLOv11 model
model_yolov11_p1.train(project=RUTA_CARPETA_YOLOV11_PROYECTOS_TRAIN, **params_1)
Initial GPU Usage
| ID | GPU | MEM |
------------------
|  0 |  4% | 21% |
GPU Usage after emptying the cache
| ID | GPU | MEM |
------------------
|  0 |  4% | 21% |
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
engine\trainer: task=detect, mode=train, model=yolo11n.pt, data=./resources\utils\yolov8.yaml, epochs=100, time=None, patience=20, batch=16, imgsz=640, save=True, save_period=-1, cache=False, device=cuda, workers=8, project=yolo\yolov11\train, name=palmeras_imgsz640_epochs100_r0, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=4, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, label_smoothing=0.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=yolo\yolov11\train\palmeras_imgsz640_epochs100_r0
Overriding model.yaml nc=80 with nc=2

                   from  n    params  module                                       arguments                     
  0                  -1  1       464  ultralytics.nn.modules.conv.Conv             [3, 16, 3, 2]                 
  1                  -1  1      4672  ultralytics.nn.modules.conv.Conv             [16, 32, 3, 2]                
  2                  -1  1      6640  ultralytics.nn.modules.block.C3k2            [32, 64, 1, False, 0.25]      
  3                  -1  1     36992  ultralytics.nn.modules.conv.Conv             [64, 64, 3, 2]                
  4                  -1  1     26080  ultralytics.nn.modules.block.C3k2            [64, 128, 1, False, 0.25]     
  5                  -1  1    147712  ultralytics.nn.modules.conv.Conv             [128, 128, 3, 2]              
  6                  -1  1     87040  ultralytics.nn.modules.block.C3k2            [128, 128, 1, True]           
  7                  -1  1    295424  ultralytics.nn.modules.conv.Conv             [128, 256, 3, 2]              
  8                  -1  1    346112  ultralytics.nn.modules.block.C3k2            [256, 256, 1, True]           
  9                  -1  1    164608  ultralytics.nn.modules.block.SPPF            [256, 256, 5]                 
 10                  -1  1    249728  ultralytics.nn.modules.block.C2PSA           [256, 256, 1]                 
 11                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 12             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 13                  -1  1    111296  ultralytics.nn.modules.block.C3k2            [384, 128, 1, False]          
 14                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 15             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 16                  -1  1     32096  ultralytics.nn.modules.block.C3k2            [256, 64, 1, False]           
 17                  -1  1     36992  ultralytics.nn.modules.conv.Conv             [64, 64, 3, 2]                
 18            [-1, 13]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 19                  -1  1     86720  ultralytics.nn.modules.block.C3k2            [192, 128, 1, False]          
 20                  -1  1    147712  ultralytics.nn.modules.conv.Conv             [128, 128, 3, 2]              
 21            [-1, 10]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 22                  -1  1    378880  ultralytics.nn.modules.block.C3k2            [384, 256, 1, True]           
 23        [16, 19, 22]  1    431062  ultralytics.nn.modules.head.Detect           [2, [64, 128, 256]]           
YOLO11n summary: 319 layers, 2,590,230 parameters, 2,590,214 gradients, 6.4 GFLOPs

Transferred 448/499 items from pretrained weights
TensorBoard: Start with 'tensorboard --logdir yolo\yolov11\train\palmeras_imgsz640_epochs100_r0', view at http://localhost:6006/
Freezing layer 'model.23.dfl.conv.weight'
AMP: running Automatic Mixed Precision (AMP) checks with YOLO11n...
AMP: checks passed 
train: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\train.cache... 279 images, 0 backgrounds, 0 corrupt: 100%|██████████| 279/279 [00:00<?, ?it/s]
val: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\test.cache... 70 images, 0 backgrounds, 0 corrupt: 100%|██████████| 70/70 [00:00<?, ?it/s]
Plotting labels to yolo\yolov11\train\palmeras_imgsz640_epochs100_r0\labels.jpg... 
optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 81 weight(decay=0.0), 88 weight(decay=0.0005), 87 bias(decay=0.0)
TensorBoard: model graph visualization added 
Image sizes 640 train, 640 val
Using 8 dataloader workers
Logging results to yolo\yolov11\train\palmeras_imgsz640_epochs100_r0
Starting training for 100 epochs...

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      1/100      3.54G      2.156      3.576      1.681        438        640: 100%|██████████| 18/18 [00:05<00:00,  3.51it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.75it/s]
                   all         70       2670     0.0811      0.548      0.224      0.109

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      2/100      4.03G      1.594       2.52      1.175        528        640: 100%|██████████| 18/18 [00:04<00:00,  4.45it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.24it/s]
                   all         70       2670     0.0819      0.543      0.346      0.181

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      3/100      3.92G      1.561      1.537      1.146        435        640: 100%|██████████| 18/18 [00:04<00:00,  4.48it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.46it/s]
                   all         70       2670      0.932      0.291      0.386      0.207

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      4/100      3.86G      1.439       1.18      1.139        262        640: 100%|██████████| 18/18 [00:04<00:00,  4.39it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.10it/s]
                   all         70       2670      0.944      0.408      0.502      0.284

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      5/100       3.5G      1.402      1.037      1.134        266        640: 100%|██████████| 18/18 [00:03<00:00,  4.58it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.32it/s]
                   all         70       2670      0.529      0.576      0.579      0.317

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      6/100       3.4G      1.395      1.015      1.111        204        640: 100%|██████████| 18/18 [00:03<00:00,  4.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.92it/s]
                   all         70       2670      0.538      0.523      0.532      0.286

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      7/100      3.54G      1.415     0.9629      1.122        408        640: 100%|██████████| 18/18 [00:03<00:00,  4.66it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.13it/s]
                   all         70       2670      0.513      0.592      0.576      0.328

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      8/100      4.81G      1.384     0.8925      1.106        477        640: 100%|██████████| 18/18 [00:03<00:00,  4.50it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.82it/s]
                   all         70       2670      0.569        0.5      0.541      0.303

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      9/100      3.64G      1.348     0.8806      1.103        449        640: 100%|██████████| 18/18 [00:03<00:00,  5.06it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.96it/s]
                   all         70       2670       0.63      0.627      0.622      0.353

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     10/100       4.8G      1.311     0.8499      1.092        383        640: 100%|██████████| 18/18 [00:08<00:00,  2.11it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.05it/s]
                   all         70       2670      0.693      0.669      0.689      0.389

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     11/100      4.03G      1.296     0.8175      1.086        427        640: 100%|██████████| 18/18 [00:03<00:00,  5.17it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.20it/s]
                   all         70       2670      0.653      0.668      0.665      0.359

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     12/100      3.88G      1.312      0.798      1.092        274        640: 100%|██████████| 18/18 [00:03<00:00,  4.84it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.29it/s]
                   all         70       2670      0.695      0.663       0.69      0.385

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     13/100      3.88G      1.346      0.802      1.096        532        640: 100%|██████████| 18/18 [00:03<00:00,  5.08it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.28it/s]
                   all         70       2670      0.706      0.675      0.701       0.38

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     14/100      3.88G      1.335     0.7829       1.09        287        640: 100%|██████████| 18/18 [00:03<00:00,  4.76it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.42it/s]
                   all         70       2670      0.698      0.635      0.693       0.38

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     15/100      3.25G      1.322     0.8042      1.092        386        640: 100%|██████████| 18/18 [00:03<00:00,  5.14it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.07it/s]
                   all         70       2670      0.731      0.671      0.721      0.403

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     16/100      4.14G      1.298     0.7653      1.088        537        640: 100%|██████████| 18/18 [00:03<00:00,  4.93it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.63it/s]
                   all         70       2670      0.723      0.714      0.739      0.402

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     17/100      3.82G      1.332     0.7499      1.092        366        640: 100%|██████████| 18/18 [00:03<00:00,  5.02it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.68it/s]
                   all         70       2670       0.75      0.703      0.743        0.4

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     18/100      3.99G      1.314     0.7607      1.084        476        640: 100%|██████████| 18/18 [00:03<00:00,  4.91it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.89it/s]
                   all         70       2670      0.691      0.732      0.723      0.395

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     19/100      3.44G      1.314     0.7444      1.085        444        640: 100%|██████████| 18/18 [00:03<00:00,  5.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.37it/s]
                   all         70       2670      0.747      0.745      0.763      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     20/100      4.11G      1.276     0.7253      1.079        423        640: 100%|██████████| 18/18 [00:03<00:00,  4.55it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.16it/s]
                   all         70       2670      0.752      0.756      0.779      0.424

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     21/100      4.22G       1.29     0.7077      1.087        379        640: 100%|██████████| 18/18 [00:03<00:00,  5.13it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.19it/s]
                   all         70       2670       0.79      0.712      0.772      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     22/100      4.09G      1.276     0.7106      1.076        634        640: 100%|██████████| 18/18 [00:03<00:00,  4.85it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.66it/s]
                   all         70       2670      0.767      0.756      0.789      0.431

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     23/100      4.43G      1.275     0.6932      1.072        520        640: 100%|██████████| 18/18 [00:03<00:00,  4.94it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.06it/s]
                   all         70       2670      0.752      0.695      0.755      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     24/100      4.71G      1.236     0.6957      1.067        351        640: 100%|██████████| 18/18 [00:03<00:00,  4.90it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.03it/s]
                   all         70       2670      0.756      0.711      0.755      0.417

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     25/100      3.35G      1.238      0.699      1.072        454        640: 100%|██████████| 18/18 [00:03<00:00,  5.04it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.98it/s]
                   all         70       2670      0.772      0.757      0.792      0.426

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     26/100      4.29G      1.262     0.6855       1.06        441        640: 100%|██████████| 18/18 [00:03<00:00,  4.67it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.90it/s]
                   all         70       2670      0.765      0.733       0.78      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     27/100      3.52G      1.252     0.6901       1.06        538        640: 100%|██████████| 18/18 [00:03<00:00,  5.03it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.08it/s]
                   all         70       2670      0.724      0.772      0.768      0.438

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     28/100      4.43G      1.244      0.679      1.066        373        640: 100%|██████████| 18/18 [00:03<00:00,  4.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.74it/s]
                   all         70       2670      0.743      0.742      0.755       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     29/100      3.64G      1.237     0.6636      1.053        387        640: 100%|██████████| 18/18 [00:03<00:00,  4.61it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.25it/s]
                   all         70       2670      0.756      0.771      0.786      0.442

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     30/100      4.09G      1.254     0.6769      1.056        568        640: 100%|██████████| 18/18 [00:04<00:00,  4.48it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.46it/s]
                   all         70       2670      0.767      0.772      0.795      0.433

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     31/100      3.93G      1.226     0.6508      1.052        272        640: 100%|██████████| 18/18 [00:03<00:00,  4.70it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.08it/s]
                   all         70       2670      0.784      0.758      0.821      0.446

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     32/100      4.07G      1.233     0.6573      1.068        362        640: 100%|██████████| 18/18 [00:03<00:00,  4.82it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.87it/s]
                   all         70       2670       0.76      0.757      0.792      0.439

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     33/100      3.91G      1.213     0.6536      1.056        374        640: 100%|██████████| 18/18 [00:03<00:00,  5.22it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.85it/s]
                   all         70       2670       0.78      0.774      0.809      0.454

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     34/100      4.42G      1.269     0.6727      1.061        434        640: 100%|██████████| 18/18 [00:04<00:00,  4.48it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.16it/s]
                   all         70       2670      0.782      0.753      0.797      0.452

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     35/100      4.35G      1.239     0.6634      1.046        524        640: 100%|██████████| 18/18 [00:04<00:00,  4.48it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.32it/s]
                   all         70       2670      0.752       0.76      0.801      0.447

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     36/100      4.01G      1.247     0.6559      1.059        306        640: 100%|██████████| 18/18 [00:03<00:00,  4.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.99it/s]
                   all         70       2670      0.763      0.734      0.763      0.426

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     37/100      3.46G      1.242     0.6598      1.063        322        640: 100%|██████████| 18/18 [00:03<00:00,  5.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.17it/s]
                   all         70       2670      0.731      0.735      0.762       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     38/100      4.08G      1.241      0.648      1.058        347        640: 100%|██████████| 18/18 [00:03<00:00,  4.79it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.87it/s]
                   all         70       2670      0.724      0.728      0.749       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     39/100      4.13G      1.227     0.6396      1.055        438        640: 100%|██████████| 18/18 [00:04<00:00,  4.44it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.81it/s]
                   all         70       2670      0.775      0.755      0.796      0.451

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     40/100      4.33G      1.214      0.625      1.056        433        640: 100%|██████████| 18/18 [00:03<00:00,  4.63it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.26it/s]
                   all         70       2670      0.799      0.756      0.814      0.462

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     41/100      4.09G      1.205     0.6276      1.045        384        640: 100%|██████████| 18/18 [00:04<00:00,  4.46it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.32it/s]
                   all         70       2670      0.809      0.754      0.813      0.466

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     42/100      3.64G      1.227     0.6375      1.059        343        640: 100%|██████████| 18/18 [00:03<00:00,  4.91it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.49it/s]
                   all         70       2670      0.784      0.749      0.805      0.457

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     43/100      4.06G      1.223     0.6381      1.057        419        640: 100%|██████████| 18/18 [00:03<00:00,  4.73it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.11it/s]
                   all         70       2670      0.783      0.742      0.795      0.445

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     44/100      3.78G       1.22     0.6277      1.045        620        640: 100%|██████████| 18/18 [00:03<00:00,  4.85it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.59it/s]
                   all         70       2670      0.761      0.734      0.766      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     45/100      3.59G      1.179     0.6055      1.036        497        640: 100%|██████████| 18/18 [00:03<00:00,  5.04it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.99it/s]
                   all         70       2670      0.774      0.756      0.792      0.438

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     46/100      3.82G       1.21     0.6219      1.044        539        640: 100%|██████████| 18/18 [00:03<00:00,  4.88it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.85it/s]
                   all         70       2670       0.78      0.761       0.81      0.445

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     47/100      3.24G      1.192     0.6127      1.045        293        640: 100%|██████████| 18/18 [00:03<00:00,  4.82it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.15it/s]
                   all         70       2670      0.818      0.749      0.819      0.455

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     48/100      4.55G      1.176     0.6053       1.04        352        640: 100%|██████████| 18/18 [00:03<00:00,  4.64it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.69it/s]
                   all         70       2670       0.83      0.747      0.824       0.46

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     49/100      3.62G       1.19     0.6006      1.044        541        640: 100%|██████████| 18/18 [00:03<00:00,  4.85it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.33it/s]
                   all         70       2670      0.806      0.773      0.829      0.466

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     50/100      4.24G      1.227     0.6211      1.052        407        640: 100%|██████████| 18/18 [00:03<00:00,  4.76it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.06it/s]
                   all         70       2670      0.815      0.754      0.811      0.454

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     51/100      3.57G      1.194     0.5965       1.05        496        640: 100%|██████████| 18/18 [00:03<00:00,  4.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.65it/s]
                   all         70       2670      0.812      0.778      0.826      0.458

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     52/100      4.07G      1.195     0.5946      1.037        465        640: 100%|██████████| 18/18 [00:03<00:00,  5.02it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.54it/s]
                   all         70       2670      0.817      0.762      0.823      0.463

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     53/100      4.12G      1.193     0.5904      1.043        553        640: 100%|██████████| 18/18 [00:03<00:00,  4.92it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.71it/s]
                   all         70       2670      0.785      0.784       0.82      0.466

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     54/100      4.29G      1.163     0.5937      1.041        449        640: 100%|██████████| 18/18 [00:03<00:00,  4.88it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.13it/s]
                   all         70       2670      0.799      0.748      0.806      0.456

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     55/100      3.79G      1.174     0.5874      1.042        262        640: 100%|██████████| 18/18 [00:03<00:00,  4.61it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.80it/s]
                   all         70       2670      0.785      0.761      0.791      0.444

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     56/100      3.85G      1.172     0.5826       1.03        475        640: 100%|██████████| 18/18 [00:04<00:00,  4.40it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.46it/s]
                   all         70       2670      0.798      0.797       0.82      0.453

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     57/100      4.27G      1.176      0.596      1.049        252        640: 100%|██████████| 18/18 [00:03<00:00,  4.72it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.66it/s]
                   all         70       2670      0.775      0.801      0.818      0.448

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     58/100       4.4G      1.179     0.5848      1.037        550        640: 100%|██████████| 18/18 [00:04<00:00,  4.47it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.28it/s]
                   all         70       2670      0.798      0.785      0.824      0.452

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     59/100      4.89G      1.167     0.5913      1.034        315        640: 100%|██████████| 18/18 [01:30<00:00,  5.03s/it]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:59<00:00, 19.85s/it]
                   all         70       2670      0.822       0.78      0.817       0.46

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     60/100      3.66G      1.167     0.5836      1.038        474        640: 100%|██████████| 18/18 [00:10<00:00,  1.77it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.67it/s]
                   all         70       2670      0.783      0.769      0.819      0.464

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     61/100       4.1G      1.169     0.5776      1.019        549        640: 100%|██████████| 18/18 [00:03<00:00,  4.72it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.97it/s]
                   all         70       2670      0.822      0.754       0.81      0.458

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     62/100      4.25G      1.163     0.5754      1.038        360        640: 100%|██████████| 18/18 [00:03<00:00,  4.90it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.44it/s]
                   all         70       2670      0.824      0.771      0.823      0.454

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     63/100      4.19G      1.164     0.5683      1.031        473        640: 100%|██████████| 18/18 [00:03<00:00,  5.05it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.33it/s]
                   all         70       2670       0.81      0.735      0.799      0.444

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     64/100      4.82G      1.184     0.5701      1.031        395        640: 100%|██████████| 18/18 [00:04<00:00,  4.08it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.83it/s]
                   all         70       2670      0.774      0.796      0.813      0.454

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     65/100      3.62G      1.167     0.5666      1.033        511        640: 100%|██████████| 18/18 [00:03<00:00,  4.63it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.24it/s]
                   all         70       2670      0.802      0.792      0.833      0.457

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     66/100      3.68G       1.18     0.5798       1.04        411        640: 100%|██████████| 18/18 [00:04<00:00,  4.39it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.27it/s]
                   all         70       2670      0.838      0.784      0.831      0.463

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     67/100      3.97G      1.132     0.5516      1.034        449        640: 100%|██████████| 18/18 [00:03<00:00,  4.57it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.78it/s]
                   all         70       2670      0.818      0.787      0.828      0.456

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     68/100      4.84G      1.189      0.582      1.039        394        640: 100%|██████████| 18/18 [00:03<00:00,  4.53it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:05<00:00,  1.84s/it]
                   all         70       2670      0.821      0.775      0.825      0.457

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     69/100      3.91G      1.169     0.5716      1.034        424        640: 100%|██████████| 18/18 [00:03<00:00,  4.89it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.35it/s]
                   all         70       2670      0.814       0.78      0.827      0.461
EarlyStopping: Training stopped early as no improvement observed in last 20 epochs. Best results observed at epoch 49, best model saved as best.pt.
To update EarlyStopping(patience=20) pass a new patience value, i.e. `patience=300` or use `patience=0` to disable EarlyStopping.

69 epochs completed in 0.166 hours.
Optimizer stripped from yolo\yolov11\train\palmeras_imgsz640_epochs100_r0\weights\last.pt, 5.5MB
Optimizer stripped from yolo\yolov11\train\palmeras_imgsz640_epochs100_r0\weights\best.pt, 5.5MB

Validating yolo\yolov11\train\palmeras_imgsz640_epochs100_r0\weights\best.pt...
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
YOLO11n summary (fused): 238 layers, 2,582,542 parameters, 0 gradients, 6.3 GFLOPs
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:03<00:00,  1.07s/it]
                   all         70       2670       0.81      0.773      0.832      0.466
                  Tree         43        325      0.665       0.64      0.695      0.322
                  Palm         66       2345      0.955      0.907      0.969       0.61
Speed: 0.4ms preprocess, 4.0ms inference, 0.0ms loss, 2.2ms postprocess per image
Results saved to yolo\yolov11\train\palmeras_imgsz640_epochs100_r0
Out[ ]:
ultralytics.utils.metrics.DetMetrics object with attributes:

ap_class_index: array([0, 1])
box: ultralytics.utils.metrics.Metric object
confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x000001278B603450>
curves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)']
curves_results: [[array([          0,    0.001001,    0.002002, ...,       0.998,       0.999,           1]), array([[          1,           1,           1, ...,   0.0015147,  0.00075737,           0],
       [          1,           1,           1, ...,    0.025657,    0.012828,           0]]), 'Recall', 'Precision'], ...]
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[   0.063038,    0.063038,    0.075066, ...,           0,           0,           0],
       [    0.33549,     0.33549,     0.41381, ...,           0,           0,           0]]), 'Confidence', 'F1'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[   0.032593,    0.032593,    0.039065, ...,           1,           1,           1],
       [    0.20221,     0.20221,     0.26201, ...,           1,           1,           1]]), 'Confidence', 'Precision'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[    0.95692,     0.95692,     0.95692, ...,           0,           0,           0],
       [    0.98422,     0.98422,      0.9838, ...,           0,           0,           0]]), 'Confidence', 'Recall']]
fitness: 0.502632982904651
keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']
maps: array([    0.32175,     0.61033])
names: {0: 'Tree', 1: 'Palm'}
plot: True
results_dict: {'metrics/precision(B)': 0.8100651344029258, 'metrics/recall(B)': 0.7733291820605253, 'metrics/mAP50(B)': 0.8319734423302556, 'metrics/mAP50-95(B)': 0.46603959852402826, 'fitness': 0.502632982904651}
save_dir: WindowsPath('yolo/yolov11/train/palmeras_imgsz640_epochs100_r0')
speed: {'preprocess': 0.3856863294328962, 'inference': 4.02989387512207, 'loss': 0.0, 'postprocess': 2.171393803187779}
task: 'detect'
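As a sanity check, the `fitness` value in `results_dict` above can be reproduced from the mAP metrics: Ultralytics computes detection fitness as a weighted sum of mAP@0.5 and mAP@0.5:0.95, with weights 0.1 and 0.9. A minimal sketch using the numbers reported above:

```python
# Sanity check: Ultralytics' detection "fitness" is a weighted sum of
# mAP@0.5 (weight 0.1) and mAP@0.5:0.95 (weight 0.9).
map50 = 0.8319734423302556      # metrics/mAP50(B) from results_dict above
map50_95 = 0.46603959852402826  # metrics/mAP50-95(B) from results_dict above

fitness = 0.1 * map50 + 0.9 * map50_95
print(fitness)  # ~0.50263, matching the reported fitness value
```

This explains why mAP@0.5:0.95 dominates model selection during training: it carries 90% of the weight in the score used to pick the best checkpoint.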
In [ ]:
# Train the base model
model_yolov11_p2.train(project=RUTA_CARPETA_YOLOV11_PROYECTOS_TRAIN, **params_2)
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
engine\trainer: task=detect, mode=train, model=yolo11n.pt, data=./resources\utils\yolov8.yaml, epochs=200, time=None, patience=20, batch=16, imgsz=480, save=True, save_period=-1, cache=False, device=cuda, workers=8, project=yolo\yolov11\train, name=palmeras_imgsz480_epochs200_r0, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=4, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, label_smoothing=0.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=yolo\yolov11\train\palmeras_imgsz480_epochs200_r0
Overriding model.yaml nc=80 with nc=2

                   from  n    params  module                                       arguments                     
  0                  -1  1       464  ultralytics.nn.modules.conv.Conv             [3, 16, 3, 2]                 
  1                  -1  1      4672  ultralytics.nn.modules.conv.Conv             [16, 32, 3, 2]                
  2                  -1  1      6640  ultralytics.nn.modules.block.C3k2            [32, 64, 1, False, 0.25]      
  3                  -1  1     36992  ultralytics.nn.modules.conv.Conv             [64, 64, 3, 2]                
  4                  -1  1     26080  ultralytics.nn.modules.block.C3k2            [64, 128, 1, False, 0.25]     
  5                  -1  1    147712  ultralytics.nn.modules.conv.Conv             [128, 128, 3, 2]              
  6                  -1  1     87040  ultralytics.nn.modules.block.C3k2            [128, 128, 1, True]           
  7                  -1  1    295424  ultralytics.nn.modules.conv.Conv             [128, 256, 3, 2]              
  8                  -1  1    346112  ultralytics.nn.modules.block.C3k2            [256, 256, 1, True]           
  9                  -1  1    164608  ultralytics.nn.modules.block.SPPF            [256, 256, 5]                 
 10                  -1  1    249728  ultralytics.nn.modules.block.C2PSA           [256, 256, 1]                 
 11                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 12             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 13                  -1  1    111296  ultralytics.nn.modules.block.C3k2            [384, 128, 1, False]          
 14                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
 15             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 16                  -1  1     32096  ultralytics.nn.modules.block.C3k2            [256, 64, 1, False]           
 17                  -1  1     36992  ultralytics.nn.modules.conv.Conv             [64, 64, 3, 2]                
 18            [-1, 13]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 19                  -1  1     86720  ultralytics.nn.modules.block.C3k2            [192, 128, 1, False]          
 20                  -1  1    147712  ultralytics.nn.modules.conv.Conv             [128, 128, 3, 2]              
 21            [-1, 10]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
 22                  -1  1    378880  ultralytics.nn.modules.block.C3k2            [384, 256, 1, True]           
 23        [16, 19, 22]  1    431062  ultralytics.nn.modules.head.Detect           [2, [64, 128, 256]]           
YOLO11n summary: 319 layers, 2,590,230 parameters, 2,590,214 gradients, 6.4 GFLOPs

Transferred 448/499 items from pretrained weights
TensorBoard: Start with 'tensorboard --logdir yolo\yolov11\train\palmeras_imgsz480_epochs200_r0', view at http://localhost:6006/
Freezing layer 'model.23.dfl.conv.weight'
AMP: running Automatic Mixed Precision (AMP) checks with YOLO11n...
AMP: checks passed 
train: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\train.cache... 279 images, 0 backgrounds, 0 corrupt: 100%|██████████| 279/279 [00:00<?, ?it/s]
val: Scanning D:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\datasets\raw\palmeras_kaggle_yolo_format\test.cache... 70 images, 0 backgrounds, 0 corrupt: 100%|██████████| 70/70 [00:00<?, ?it/s]
Plotting labels to yolo\yolov11\train\palmeras_imgsz480_epochs200_r0\labels.jpg... 
optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 81 weight(decay=0.0), 88 weight(decay=0.0005), 87 bias(decay=0.0)
TensorBoard: model graph visualization added 
Image sizes 480 train, 480 val
Using 8 dataloader workers
Logging results to yolo\yolov11\train\palmeras_imgsz480_epochs200_r0
Starting training for 200 epochs...

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      1/200      2.07G       2.46       3.72      1.711        439        480: 100%|██████████| 18/18 [00:04<00:00,  4.45it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.25it/s]
                   all         70       2670       0.06      0.442     0.0587     0.0289

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      2/200      2.35G      1.689      2.877      1.162        527        480: 100%|██████████| 18/18 [00:04<00:00,  4.12it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.39it/s]
                   all         70       2670     0.0834      0.555      0.316      0.151

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      3/200       2.3G      1.676       1.89      1.106        435        480: 100%|██████████| 18/18 [00:04<00:00,  4.40it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.54it/s]
                   all         70       2670       0.96      0.218      0.387      0.203

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      4/200      2.26G      1.566      1.345      1.087        262        480: 100%|██████████| 18/18 [00:03<00:00,  4.70it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:02<00:00,  1.42it/s]
                   all         70       2670      0.914      0.419      0.441      0.238

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      5/200      2.06G       1.47      1.125      1.077        265        480: 100%|██████████| 18/18 [00:03<00:00,  5.48it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.91it/s]
                   all         70       2670      0.823      0.378      0.381        0.2

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      6/200         2G      1.469      1.085      1.066        204        480: 100%|██████████| 18/18 [00:03<00:00,  5.80it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.07it/s]
                   all         70       2670      0.904      0.401      0.445      0.251

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      7/200      2.08G      1.434     0.9941      1.059        408        480: 100%|██████████| 18/18 [00:04<00:00,  4.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.63it/s]
                   all         70       2670      0.929      0.437       0.51      0.294

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      8/200       2.8G        1.4     0.9392      1.046        477        480: 100%|██████████| 18/18 [00:03<00:00,  4.83it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.97it/s]
                   all         70       2670      0.742      0.519      0.596      0.326

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
      9/200      2.14G      1.406     0.9041      1.062        449        480: 100%|██████████| 18/18 [00:03<00:00,  5.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.52it/s]
                   all         70       2670      0.734      0.451      0.547       0.31

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     10/200      2.78G       1.36     0.8727      1.046        383        480: 100%|██████████| 18/18 [00:03<00:00,  5.87it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.00it/s]
                   all         70       2670       0.65      0.547      0.599      0.325

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     11/200      2.36G      1.353     0.8352      1.049        426        480: 100%|██████████| 18/18 [00:02<00:00,  6.08it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.60it/s]
                   all         70       2670      0.596      0.621      0.619      0.334

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     12/200      2.27G      1.353     0.8165      1.042        274        480: 100%|██████████| 18/18 [00:03<00:00,  5.75it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.74it/s]
                   all         70       2670      0.578      0.599      0.601      0.316

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     13/200      2.26G      1.395     0.8096      1.054        531        480: 100%|██████████| 18/18 [00:02<00:00,  6.15it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:02<00:00,  1.45it/s]
                   all         70       2670      0.607       0.64      0.629      0.325

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     14/200      2.27G       1.39     0.7974      1.045        287        480: 100%|██████████| 18/18 [00:02<00:00,  6.06it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.51it/s]
                   all         70       2670      0.641      0.623      0.654       0.36

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     15/200      1.92G      1.385      0.811      1.053        382        480: 100%|██████████| 18/18 [00:03<00:00,  5.94it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.31it/s]
                   all         70       2670      0.639      0.643      0.658      0.363

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     16/200      2.42G      1.324      0.773      1.041        540        480: 100%|██████████| 18/18 [00:03<00:00,  5.74it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.06it/s]
                   all         70       2670      0.708      0.636      0.679      0.366

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     17/200      2.23G      1.343     0.7413      1.033        366        480: 100%|██████████| 18/18 [00:03<00:00,  5.81it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.27it/s]
                   all         70       2670       0.75      0.614      0.671      0.369

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     18/200      2.34G      1.328     0.7401      1.034        476        480: 100%|██████████| 18/18 [00:03<00:00,  5.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.24it/s]
                   all         70       2670      0.727      0.683      0.705      0.394

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     19/200      2.02G      1.342     0.7514      1.034        444        480: 100%|██████████| 18/18 [00:03<00:00,  5.46it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.09it/s]
                   all         70       2670       0.66      0.668      0.668      0.373

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     20/200       2.4G      1.307     0.7177      1.035        423        480: 100%|██████████| 18/18 [00:04<00:00,  4.44it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.70it/s]
                   all         70       2670      0.744      0.647        0.7      0.382

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     21/200      2.46G      1.344     0.7206      1.042        379        480: 100%|██████████| 18/18 [00:02<00:00,  6.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.20it/s]
                   all         70       2670      0.756      0.672      0.725      0.392

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     22/200      2.39G      1.318     0.7155      1.032        632        480: 100%|██████████| 18/18 [00:03<00:00,  5.56it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.34it/s]
                   all         70       2670      0.766      0.666      0.733      0.394

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     23/200      2.58G      1.314     0.7059      1.024        520        480: 100%|██████████| 18/18 [00:02<00:00,  6.35it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.29it/s]
                   all         70       2670      0.698      0.664      0.695      0.377

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     24/200      2.73G      1.273     0.7097      1.028        351        480: 100%|██████████| 18/18 [00:02<00:00,  6.26it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.11it/s]
                   all         70       2670      0.749      0.669      0.708      0.389

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     25/200      1.98G       1.28     0.7076      1.032        455        480: 100%|██████████| 18/18 [00:03<00:00,  5.82it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.18it/s]
                   all         70       2670      0.752      0.699      0.738      0.407

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     26/200      2.51G      1.301     0.7059      1.022        440        480: 100%|██████████| 18/18 [00:03<00:00,  5.49it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.11it/s]
                   all         70       2670       0.71      0.658      0.684      0.375

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     27/200      2.07G      1.296     0.7095      1.023        538        480: 100%|██████████| 18/18 [00:03<00:00,  5.55it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.86it/s]
                   all         70       2670      0.717      0.708      0.719        0.4

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     28/200      2.58G      1.286     0.6894      1.032        372        480: 100%|██████████| 18/18 [00:03<00:00,  5.84it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.47it/s]
                   all         70       2670       0.74      0.662      0.701       0.38

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     29/200      2.14G      1.277     0.6832      1.016        387        480: 100%|██████████| 18/18 [00:03<00:00,  5.66it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.02it/s]
                   all         70       2670      0.745      0.699      0.725      0.394

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     30/200      2.39G       1.29     0.6836      1.017        568        480: 100%|██████████| 18/18 [00:03<00:00,  6.00it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.88it/s]
                   all         70       2670      0.669      0.632      0.646      0.361

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     31/200       2.3G      1.269     0.6691      1.014        272        480: 100%|██████████| 18/18 [00:03<00:00,  5.80it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.71it/s]
                   all         70       2670      0.793      0.676      0.754      0.403

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     32/200      2.38G      1.272     0.6775      1.026        363        480: 100%|██████████| 18/18 [00:02<00:00,  6.04it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.33it/s]
                   all         70       2670      0.733      0.708      0.726      0.395

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     33/200      2.29G      1.249     0.6647      1.016        375        480: 100%|██████████| 18/18 [00:03<00:00,  5.07it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.33it/s]
                   all         70       2670       0.78      0.693      0.745      0.406

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     34/200      2.58G      1.308      0.688      1.016        434        480: 100%|██████████| 18/18 [00:03<00:00,  5.56it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.66it/s]
                   all         70       2670      0.816      0.699      0.761      0.414

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     35/200      2.54G      1.288     0.6739      1.012        523        480: 100%|██████████| 18/18 [00:03<00:00,  4.57it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.94it/s]
                   all         70       2670      0.717      0.663      0.708      0.392

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     36/200      2.34G      1.308     0.6644      1.021        306        480: 100%|██████████| 18/18 [00:04<00:00,  4.12it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.13it/s]
                   all         70       2670      0.727       0.68      0.708       0.39

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     37/200      2.04G      1.287     0.6648      1.024        321        480: 100%|██████████| 18/18 [00:02<00:00,  6.06it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.00it/s]
                   all         70       2670       0.69      0.693      0.714      0.397

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     38/200      2.38G      1.299     0.6641      1.023        347        480: 100%|██████████| 18/18 [00:03<00:00,  5.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.69it/s]
                   all         70       2670      0.748      0.719      0.756      0.416

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     39/200      2.41G      1.275     0.6614      1.022        439        480: 100%|██████████| 18/18 [00:03<00:00,  5.28it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.89it/s]
                   all         70       2670      0.757      0.722      0.761      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     40/200      2.53G      1.258     0.6453      1.017        433        480: 100%|██████████| 18/18 [00:03<00:00,  5.57it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.35it/s]
                   all         70       2670      0.741      0.713      0.755      0.415

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     41/200      2.39G      1.242     0.6385      1.009        384        480: 100%|██████████| 18/18 [00:03<00:00,  5.80it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.92it/s]
                   all         70       2670      0.737       0.74       0.76      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     42/200      2.36G      1.273     0.6623      1.023        342        480: 100%|██████████| 18/18 [00:03<00:00,  4.76it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.61it/s]
                   all         70       2670      0.743      0.704      0.741      0.408

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     43/200      2.38G      1.263     0.6553       1.02        419        480: 100%|██████████| 18/18 [00:03<00:00,  5.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.77it/s]
                   all         70       2670       0.75      0.708      0.747      0.414

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     44/200      2.21G      1.266     0.6465      1.004        619        480: 100%|██████████| 18/18 [00:03<00:00,  5.53it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.32it/s]
                   all         70       2670      0.757      0.726       0.75       0.42

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     45/200      2.11G      1.237     0.6268      1.002        497        480: 100%|██████████| 18/18 [00:03<00:00,  5.41it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.63it/s]
                   all         70       2670      0.764      0.739      0.757      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     46/200      2.24G      1.262     0.6383      1.009        537        480: 100%|██████████| 18/18 [00:03<00:00,  5.69it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.06it/s]
                   all         70       2670      0.709      0.736       0.75      0.412

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     47/200      1.91G       1.25     0.6289      1.014        293        480: 100%|██████████| 18/18 [00:02<00:00,  6.01it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.90it/s]
                   all         70       2670      0.781      0.691      0.755      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     48/200      2.65G      1.231     0.6261      1.008        353        480: 100%|██████████| 18/18 [00:03<00:00,  5.09it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.00it/s]
                   all         70       2670      0.757       0.73      0.766      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     49/200      2.12G      1.236     0.6249       1.01        540        480: 100%|██████████| 18/18 [00:02<00:00,  6.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.27it/s]
                   all         70       2670      0.741      0.745      0.757      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     50/200      2.47G      1.273      0.641      1.015        407        480: 100%|██████████| 18/18 [00:03<00:00,  4.72it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.73it/s]
                   all         70       2670      0.785      0.689      0.759      0.416

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     51/200       2.1G      1.238     0.6191      1.013        495        480: 100%|██████████| 18/18 [00:02<00:00,  6.39it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.67it/s]
                   all         70       2670      0.747      0.755      0.771      0.424

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     52/200      2.38G       1.25     0.6211      1.003        466        480: 100%|██████████| 18/18 [00:03<00:00,  6.00it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.13it/s]
                   all         70       2670      0.735      0.717      0.754      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     53/200      2.41G      1.244     0.6201      1.008        553        480: 100%|██████████| 18/18 [00:03<00:00,  5.85it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.46it/s]
                   all         70       2670      0.782      0.729      0.768      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     54/200      2.51G      1.216     0.6192      1.006        449        480: 100%|██████████| 18/18 [00:03<00:00,  5.90it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.63it/s]
                   all         70       2670      0.779      0.695      0.755      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     55/200      2.22G      1.232     0.6172      1.007        261        480: 100%|██████████| 18/18 [00:03<00:00,  5.10it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.71it/s]
                   all         70       2670      0.779       0.73      0.766      0.423

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     56/200      2.26G      1.221     0.6075     0.9929        475        480: 100%|██████████| 18/18 [00:03<00:00,  4.55it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.38it/s]
                   all         70       2670      0.789       0.72      0.771      0.433

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     57/200      2.49G      1.227     0.6234      1.017        252        480: 100%|██████████| 18/18 [00:03<00:00,  5.05it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.99it/s]
                   all         70       2670      0.709      0.732      0.748      0.412

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     58/200      2.57G      1.229     0.6175      1.004        549        480: 100%|██████████| 18/18 [00:03<00:00,  5.63it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.49it/s]
                   all         70       2670      0.803      0.732      0.782       0.43

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     59/200      2.91G      1.226     0.6269      1.006        315        480: 100%|██████████| 18/18 [00:03<00:00,  5.29it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.27it/s]
                   all         70       2670      0.715      0.734      0.744      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     60/200      2.16G      1.223      0.616      1.005        475        480: 100%|██████████| 18/18 [00:03<00:00,  4.97it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.58it/s]
                   all         70       2670      0.699      0.687      0.723      0.405

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     61/200       2.4G       1.24      0.613      0.993        547        480: 100%|██████████| 18/18 [00:02<00:00,  6.01it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  1.94it/s]
                   all         70       2670      0.724      0.759      0.746      0.417

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     62/200      2.48G      1.235     0.6093      1.012        360        480: 100%|██████████| 18/18 [00:03<00:00,  5.46it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.53it/s]
                   all         70       2670      0.795      0.703      0.759      0.411

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     63/200      2.45G      1.228      0.602      1.003        473        480: 100%|██████████| 18/18 [00:03<00:00,  5.56it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.02it/s]
                   all         70       2670       0.76      0.739      0.753      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     64/200       2.8G      1.234     0.5995     0.9971        395        480: 100%|██████████| 18/18 [00:03<00:00,  5.70it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.65it/s]
                   all         70       2670      0.781      0.708      0.757      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     65/200      2.12G      1.212     0.6018      0.996        511        480: 100%|██████████| 18/18 [00:03<00:00,  5.42it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.28it/s]
                   all         70       2670      0.769      0.779      0.796      0.437

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     66/200      2.16G      1.228     0.6085      1.004        411        480: 100%|██████████| 18/18 [00:03<00:00,  5.56it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.76it/s]
                   all         70       2670       0.78      0.745      0.774      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     67/200      2.33G      1.189     0.5907      1.002        449        480: 100%|██████████| 18/18 [00:03<00:00,  4.97it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.28it/s]
                   all         70       2670      0.806      0.724      0.789      0.436

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     68/200      2.82G      1.245     0.6162       1.01        392        480: 100%|██████████| 18/18 [00:03<00:00,  5.32it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.08it/s]
                   all         70       2670      0.737      0.752      0.757      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     69/200      2.29G      1.228     0.6076      1.004        425        480: 100%|██████████| 18/18 [00:03<00:00,  5.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.33it/s]
                   all         70       2670      0.772      0.748      0.775      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     70/200      2.29G      1.199     0.5883      1.004        306        480: 100%|██████████| 18/18 [00:03<00:00,  5.16it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.92it/s]
                   all         70       2670      0.769      0.746      0.768      0.422

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     71/200      2.11G      1.186     0.5769     0.9964        504        480: 100%|██████████| 18/18 [00:02<00:00,  6.01it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.11it/s]
                   all         70       2670      0.802      0.735      0.783      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     72/200      2.51G      1.245     0.6108      1.017        321        480: 100%|██████████| 18/18 [00:04<00:00,  4.19it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.63it/s]
                   all         70       2670      0.755      0.738       0.77      0.425

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     73/200      2.46G       1.21     0.5892     0.9962        497        480: 100%|██████████| 18/18 [00:03<00:00,  5.87it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.17it/s]
                   all         70       2670      0.773      0.752       0.79      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     74/200       2.4G      1.178     0.5766     0.9848        548        480: 100%|██████████| 18/18 [00:03<00:00,  5.04it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.36it/s]
                   all         70       2670      0.798      0.736      0.777      0.436

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     75/200      2.16G      1.199     0.5856     0.9868        667        480: 100%|██████████| 18/18 [00:03<00:00,  5.64it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.85it/s]
                   all         70       2670      0.758      0.746      0.765      0.421

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     76/200      2.32G      1.214     0.5875     0.9977        537        480: 100%|██████████| 18/18 [00:03<00:00,  5.31it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.50it/s]
                   all         70       2670      0.777      0.746       0.77      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     77/200         2G      1.198     0.5794     0.9986        468        480: 100%|██████████| 18/18 [00:04<00:00,  4.03it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.78it/s]
                   all         70       2670      0.805      0.737      0.776      0.444

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     78/200      2.59G      1.197      0.578     0.9873        520        480: 100%|██████████| 18/18 [00:03<00:00,  5.96it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.52it/s]
                   all         70       2670      0.767      0.752      0.788      0.439

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     79/200      2.32G      1.181     0.5761     0.9916        649        480: 100%|██████████| 18/18 [00:03<00:00,  5.45it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.53it/s]
                   all         70       2670      0.799      0.731      0.785      0.437

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     80/200      1.95G      1.202     0.5827      1.005        302        480: 100%|██████████| 18/18 [00:03<00:00,  5.47it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.22it/s]
                   all         70       2670      0.798        0.7      0.755      0.425

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     81/200      2.17G      1.182       0.57     0.9914        372        480: 100%|██████████| 18/18 [00:03<00:00,  4.97it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.75it/s]
                   all         70       2670      0.778      0.739      0.785      0.444

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     82/200      2.75G      1.188     0.5699      1.004        436        480: 100%|██████████| 18/18 [00:03<00:00,  5.96it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.21it/s]
                   all         70       2670      0.784      0.752      0.776      0.439

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     83/200      2.24G      1.197     0.5812     0.9939        445        480: 100%|██████████| 18/18 [00:02<00:00,  6.06it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.62it/s]
                   all         70       2670      0.756      0.766      0.773       0.43

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     84/200      2.09G      1.188     0.5759     0.9929        717        480: 100%|██████████| 18/18 [00:04<00:00,  3.77it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.74it/s]
                   all         70       2670      0.792      0.706      0.773      0.428

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     85/200      2.08G      1.203     0.5738     0.9909        493        480: 100%|██████████| 18/18 [00:03<00:00,  4.71it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.35it/s]
                   all         70       2670      0.752      0.769      0.781      0.426

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     86/200      2.44G      1.195     0.5777     0.9934        366        480: 100%|██████████| 18/18 [00:02<00:00,  6.15it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.22it/s]
                   all         70       2670      0.791      0.751      0.783      0.436

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     87/200      2.12G      1.188     0.5754     0.9953        361        480: 100%|██████████| 18/18 [00:04<00:00,  4.24it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.33it/s]
                   all         70       2670      0.764      0.756      0.768      0.424

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     88/200      2.68G      1.195     0.5691     0.9957        476        480: 100%|██████████| 18/18 [00:03<00:00,  5.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.75it/s]
                   all         70       2670      0.764      0.763      0.791      0.437

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     89/200       2.6G      1.196     0.5738      1.002        277        480: 100%|██████████| 18/18 [00:04<00:00,  4.44it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  4.25it/s]
                   all         70       2670      0.777      0.782      0.799      0.441

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     90/200      2.18G      1.186     0.5644      0.992        371        480: 100%|██████████| 18/18 [00:03<00:00,  5.90it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.12it/s]
                   all         70       2670      0.764      0.775      0.784      0.425

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     91/200      2.43G      1.176     0.5705     0.9855        290        480: 100%|██████████| 18/18 [00:03<00:00,  5.63it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.12it/s]
                   all         70       2670      0.761      0.758      0.771      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     92/200      2.44G      1.173     0.5638     0.9946        346        480: 100%|██████████| 18/18 [00:03<00:00,  4.73it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.64it/s]
                   all         70       2670       0.79      0.746      0.781      0.438

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     93/200      2.66G      1.187     0.5646     0.9887        415        480: 100%|██████████| 18/18 [00:03<00:00,  5.67it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.74it/s]
                   all         70       2670       0.73      0.762      0.769      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     94/200      2.26G      1.188     0.5675      0.986        389        480: 100%|██████████| 18/18 [00:03<00:00,  5.52it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.06it/s]
                   all         70       2670       0.78      0.738      0.779      0.434

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     95/200      2.08G      1.171     0.5587     0.9891        494        480: 100%|██████████| 18/18 [00:03<00:00,  5.76it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.99it/s]
                   all         70       2670      0.739       0.75       0.76      0.419

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     96/200      2.82G      1.176     0.5728     0.9902        338        480: 100%|██████████| 18/18 [00:03<00:00,  5.80it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.48it/s]
                   all         70       2670      0.745      0.757      0.771       0.43

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     97/200      2.23G      1.183      0.563     0.9885        445        480: 100%|██████████| 18/18 [00:03<00:00,  4.98it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.35it/s]
                   all         70       2670      0.758      0.754      0.774      0.427

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     98/200      2.45G      1.161     0.5566      0.982        367        480: 100%|██████████| 18/18 [00:03<00:00,  5.83it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.08it/s]
                   all         70       2670      0.757      0.773      0.776      0.429

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
     99/200      2.11G      1.172     0.5582     0.9854        424        480: 100%|██████████| 18/18 [00:03<00:00,  5.79it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.85it/s]
                   all         70       2670      0.742      0.787      0.772      0.433

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
    100/200      2.41G      1.173       0.56     0.9857        310        480: 100%|██████████| 18/18 [00:03<00:00,  5.78it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:01<00:00,  2.91it/s]
                   all         70       2670      0.782       0.74      0.768      0.435

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
    101/200      2.13G      1.174     0.5626     0.9951        362        480: 100%|██████████| 18/18 [00:03<00:00,  5.74it/s]
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:00<00:00,  3.36it/s]
                   all         70       2670       0.76      0.759      0.774      0.431
EarlyStopping: Training stopped early as no improvement observed in last 20 epochs. Best results observed at epoch 81, best model saved as best.pt.
To update EarlyStopping(patience=20) pass a new patience value, i.e. `patience=300` or use `patience=0` to disable EarlyStopping.

101 epochs completed in 0.161 hours.
Optimizer stripped from yolo\yolov11\train\palmeras_imgsz480_epochs200_r0\weights\last.pt, 5.4MB
Optimizer stripped from yolo\yolov11\train\palmeras_imgsz480_epochs200_r0\weights\best.pt, 5.4MB

Validating yolo\yolov11\train\palmeras_imgsz480_epochs200_r0\weights\best.pt...
Ultralytics 8.3.12  Python-3.11.9 torch-2.4.1+cu124 CUDA:0 (NVIDIA GeForce RTX 3060 Laptop GPU, 6144MiB)
YOLO11n summary (fused): 238 layers, 2,582,542 parameters, 0 gradients, 6.3 GFLOPs
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 3/3 [00:03<00:00,  1.05s/it]
                   all         70       2670       0.78      0.743      0.786      0.445
                  Tree         43        325      0.625      0.583      0.618      0.291
                  Palm         66       2345      0.935      0.903      0.954      0.598
Speed: 0.5ms preprocess, 2.2ms inference, 0.0ms loss, 4.8ms postprocess per image
Results saved to yolo\yolov11\train\palmeras_imgsz480_epochs200_r0
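For reference, the mAP50-95 column in the validation summary above is the mean of AP evaluated at IoU thresholds 0.50 through 0.95 in steps of 0.05 (the COCO convention), while mAP50 is the AP at the single 0.50 threshold. A minimal sketch of that averaging, using illustrative per-threshold AP values rather than the actual results of this run:

```python
import numpy as np

# IoU thresholds used by the COCO-style mAP50-95 metric: 0.50, 0.55, ..., 0.95
iou_thresholds = np.linspace(0.50, 0.95, 10)

# Hypothetical per-threshold AP values for one class (illustrative only,
# not taken from the run above); AP typically drops as the IoU threshold rises.
ap_per_threshold = np.linspace(0.80, 0.30, 10)

map50 = ap_per_threshold[0]          # AP at IoU = 0.50  -> "mAP50"
map50_95 = ap_per_threshold.mean()   # mean over the 10 thresholds -> "mAP50-95"
print(f"mAP50={map50:.3f}  mAP50-95={map50_95:.3f}")
```

This is why mAP50-95 (0.445 here) is always well below mAP50 (0.786): the stricter IoU thresholds penalize boxes that are roughly right but not tightly localized.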
Out[ ]:
ultralytics.utils.metrics.DetMetrics object with attributes:

ap_class_index: array([0, 1])
box: ultralytics.utils.metrics.Metric object
confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x000001DDC6E4A910>
curves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)']
curves_results: [arrays of 1000 confidence/recall sampling points on [0, 1] with the corresponding Precision-Recall, F1-Confidence, Precision-Confidence and Recall-Confidence curve values per class; full numeric dump omitted for brevity]
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[   0.065083,    0.065083,      0.0857, ...,           0,           0,           0],
       [    0.33878,     0.33878,     0.45839, ...,           0,           0,           0]]), 'Confidence', 'F1'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[   0.033777,    0.033777,    0.045027, ...,           1,           1,           1],
       [    0.20513,     0.20513,     0.29997, ...,           1,           1,           1]]), 'Confidence', 'Precision'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
           … (confidence/recall curve arrays truncated for brevity) …, 'Confidence', 'Recall']]
fitness: 0.4788239075167768
keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']
maps: array([    0.29108,     0.59824])
names: {0: 'Tree', 1: 'Palm'}
plot: True
results_dict: {'metrics/precision(B)': 0.7800100413229583, 'metrics/recall(B)': 0.7431596755850487, 'metrics/mAP50(B)': 0.7863008086461691, 'metrics/mAP50-95(B)': 0.44465980739128874, 'fitness': 0.4788239075167768}
save_dir: WindowsPath('yolo/yolov11/train/palmeras_imgsz480_epochs200_r0')
speed: {'preprocess': 0.48579829079764225, 'inference': 2.2145680018833707, 'loss': 0.0, 'postprocess': 4.814679282052176}
task: 'detect'
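The fitness value reported above is, in the Ultralytics implementation, a weighted combination of the detection metrics (weights [0.0, 0.0, 0.1, 0.9] over [precision, recall, mAP50, mAP50-95]). A quick sanity check, with the numbers hard-coded from the results_dict printed above:

```python
# Sketch: YOLO's detection "fitness" as a weighted metric combination.
# Values copied from the results_dict printed above.
map50 = 0.7863008086461691
map50_95 = 0.44465980739128874

fitness = 0.1 * map50 + 0.9 * map50_95
print(fitness)  # ≈ 0.4788239075, matching the reported fitness
```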

Results Verification¶

As part of the training results, YOLO also validates the model according to the configuration file (.yaml). Below is a comparison of training vs. validation losses, together with mAP50 (mean average precision at an IoU threshold of 0.5).
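As a reminder of what mAP50 measures, a minimal sketch (not the pycocotools implementation) of the IoU criterion that decides whether a predicted box counts as a true positive at the 0.5 threshold:

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes in (x1, y1, x2, y2) format."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A prediction counts as a true positive for mAP50 only if IoU >= 0.5
pred, gt = (10, 10, 50, 50), (20, 20, 60, 60)
print(iou(pred, gt))  # ≈ 0.39: below the 0.5 threshold, so not a true positive
```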

In [ ]:
# Load the training results of the YOLOv8 and YOLOv11 models
ruta_resultados_csv_1 = os.path.join(RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN, f'{NOMBRE_MODELO_YOLO}_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}_r0', 'results.csv')
ruta_resultados_csv_2 = os.path.join(RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN, f'{NOMBRE_MODELO_YOLO}_imgsz{YOLO_IMG_SIZE_2}_epochs{YOLO_EPOCHS_2}_r0', 'results.csv')
ruta_resultados_v11_csv_1 = os.path.join(RUTA_CARPETA_YOLOV11_PROYECTOS_TRAIN, f'{NOMBRE_MODELO_YOLO}_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}_r0', 'results.csv')
resultados_df_1 = pd.read_csv(ruta_resultados_csv_1)
resultados_df_2 = pd.read_csv(ruta_resultados_csv_2)
resultados_df_v11_1 = pd.read_csv(ruta_resultados_v11_csv_1)

# Plot training and validation losses
plt.figure(figsize=(12, 6))
plt.plot(resultados_df_1['epoch'], resultados_df_1['train/box_loss'], label=f'Training Loss - YOLOv8_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}', color='blue')
plt.plot(resultados_df_1['epoch'], resultados_df_1['val/box_loss'], label=f'Validation Loss - YOLOv8_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}', linestyle='--', color='blue')
plt.plot(resultados_df_2['epoch'], resultados_df_2['train/box_loss'], label=f'Training Loss - YOLOv8_imgsz{YOLO_IMG_SIZE_2}_epochs{YOLO_EPOCHS_2}', color='green')
plt.plot(resultados_df_2['epoch'], resultados_df_2['val/box_loss'], label=f'Validation Loss - YOLOv8_imgsz{YOLO_IMG_SIZE_2}_epochs{YOLO_EPOCHS_2}', linestyle='--', color='green')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.title('Training vs. Validation Loss for the YOLOv8 models')
plt.legend()
plt.grid(True)
plt.show()

# Plot mAP50 for the YOLOv8 models
plt.figure(figsize=(12, 6))
plt.plot(resultados_df_1['epoch'], resultados_df_1['metrics/mAP50(B)'], label=f'mAP50 - YOLOv8_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}', color='blue')
plt.plot(resultados_df_2['epoch'], resultados_df_2['metrics/mAP50(B)'], label=f'mAP50 - YOLOv8_imgsz{YOLO_IMG_SIZE_2}_epochs{YOLO_EPOCHS_2}', color='green')
plt.xlabel('Epoch')
plt.ylabel('mAP50')
plt.title('mAP50 for the YOLOv8 models')
plt.legend()
plt.grid(True)
plt.show()

# Compare mAP50 between the YOLOv8 and YOLOv11 models
plt.figure(figsize=(12, 6))
plt.plot(resultados_df_1['epoch'], resultados_df_1['metrics/mAP50(B)'], label=f'mAP50 - YOLOv8_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}', color='blue')
plt.plot(resultados_df_v11_1['epoch'], resultados_df_v11_1['metrics/mAP50(B)'], label=f'mAP50 - YOLOv11_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}', color='red')
plt.xlabel('Epoch')
plt.ylabel('mAP50')
plt.title('mAP50 for the YOLOv8 and YOLOv11 models')
plt.legend()
plt.grid(True)
plt.show()

We can observe that a larger image size yields better results in terms of mAP50. However, it comes at a higher computational cost, so the choice depends on the available hardware.

Comparing the precision of the v8 model with the more recent v11, there is little difference. However, the improvement of v11 may show up in the larger (and more complex) model variants.
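To go beyond eyeballing the curves, the best epoch per run can be extracted from each results.csv. A sketch with a toy DataFrame mimicking the results.csv columns (in the notebook you would pass `resultados_df_1`, `resultados_df_v11_1`, etc.):

```python
import pandas as pd

# Toy stand-in for a YOLO results.csv table
df = pd.DataFrame({
    'epoch': [1, 2, 3],
    'metrics/mAP50(B)': [0.61, 0.74, 0.72],
})

# Row with the highest mAP50
best = df.loc[df['metrics/mAP50(B)'].idxmax()]
print(int(best['epoch']), best['metrics/mAP50(B)'])  # 2 0.74
```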

Inference¶

In [20]:
RUTA_CARPETA_YOLOV8_BEST_TRAINED = os.path.join(RUTA_CARPETA_YOLOV8_PROYECTOS_TRAIN, f'{NOMBRE_MODELO_YOLO}_imgsz{YOLO_IMG_SIZE_1}_epochs{YOLO_EPOCHS_1}_r0', 'weights', 'best.pt')
RUTA_CARPETA_YOLOV8_BASE = 'yolov8n.pt'

# Load the best weights of the fine-tuned model, plus the base model
inf_model_yolo_v8 = YOLO(RUTA_CARPETA_YOLOV8_BEST_TRAINED)
inf_model_yolov8n = YOLO(RUTA_CARPETA_YOLOV8_BASE)

IMAGENES_PARA_INFERENCIA_PATH = os.path.join(RUTA_CARPETA_RESOURCES, 'imgs', 'inferencia')
RESULTADOS_INFERENCIA_PATH = os.path.join(IMAGENES_PARA_INFERENCIA_PATH, 'yolo_results')

# Run the inferences
#set_imagenes = ['mvd_001.jpg']
results = inf_model_yolo_v8(IMAGENES_PARA_INFERENCIA_PATH, classes=[1], save=True, project=RESULTADOS_INFERENCIA_PATH, name='yolov8_retrained')
results_base = inf_model_yolov8n(IMAGENES_PARA_INFERENCIA_PATH, save=True, project=RESULTADOS_INFERENCIA_PATH, name='yolov8_base')  # Default classes

# Process the results
for result in results:
    boxes = result.boxes  # Boxes object for detection outputs
    masks = result.masks  # Masks object for segmentation outputs
    keypoints = result.keypoints  # Keypoints object for pose outputs
    probs = result.probs  # Probs object for classification outputs
    obb = result.obb  # OBB object (oriented bounding boxes)
    result.show()

for result in results_base:
    boxes = result.boxes  # Boxes object for detection outputs
    masks = result.masks  # Masks object for segmentation outputs
    keypoints = result.keypoints  # Keypoints object for pose outputs
    probs = result.probs  # Probs object for classification outputs
    obb = result.obb  # OBB object (oriented bounding boxes)
    result.show()
image 1/3 d:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\imgs\inferencia\inf_001.jpg: 544x640 40 Palms, 8.6ms
image 2/3 d:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\imgs\inferencia\inf_002.jpg: 416x640 32 Palms, 9.4ms
image 3/3 d:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\imgs\inferencia\inf_003.jpg: 480x640 39 Palms, 9.3ms
Speed: 3.2ms preprocess, 9.1ms inference, 1.3ms postprocess per image at shape (1, 3, 480, 640)
Results saved to resources\imgs\inferencia\yolo_results\yolov8_retrained

image 1/3 d:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\imgs\inferencia\inf_001.jpg: 544x640 (no detections), 8.2ms
image 2/3 d:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\imgs\inferencia\inf_002.jpg: 416x640 1 potted plant, 11.0ms
image 3/3 d:\ProyectosUBA\CEIA\5_Vision_por_Computadora_II\repo_bruno\uba-ceia\ceia-vpc2\resources\imgs\inferencia\inf_003.jpg: 480x640 (no detections), 9.0ms
Speed: 2.0ms preprocess, 9.4ms inference, 1.7ms postprocess per image at shape (1, 3, 480, 640)
Results saved to resources\imgs\inferencia\yolo_results\yolov8_base

🔮 Helper function:

Receives a list of images as tensors and plots them in a single row.

In [38]:
def show_images_in_line_from_tensors(tensor_images: List[torch.Tensor]) -> None:
    """
    Plots a list of images, given as tensors, in a single row.

    Parameters:
    - tensor_images: list of image tensors (must contain exactly 3 tensors).
    """
    if len(tensor_images) != 3:
        raise ValueError("This function requires exactly three image tensors.")

    # Convert to PIL format for visualization with Matplotlib
    images = [T.ToPILImage()(img) for img in tensor_images]

    # Create a figure with three subplots
    fig, axes = plt.subplots(1, 3, figsize=(15, 5))

    # Show each image in its corresponding subplot
    for i, img in enumerate(images):
        axes[i].imshow(img)
        axes[i].axis('off')  # Hide the axes

    # Display the images
    plt.show()

For a preliminary sanity check of the model, we use images obtained from satellite imagery services that the model never saw during training. The base images are:

In [39]:
base_dir = os.path.join(RUTA_CARPETA_RESOURCES, 'imgs', 'inferencia')
img_base_list = [read_image(os.path.join(base_dir, img_path)) for img_path in os.listdir(base_dir) if img_path.endswith(FORMATOS_IMAGENES)]


show_images_in_line_from_tensors(img_base_list)

Predictions from the base YOLOv8 model (no fine-tuning):

In [40]:
base_dir = os.path.join(RUTA_CARPETA_RESOURCES, 'imgs', 'inferencia', 'yolo_results', 'yolov8_base')
img_base_list = [read_image(os.path.join(base_dir, img_path)) for img_path in os.listdir(base_dir) if img_path.endswith(FORMATOS_IMAGENES)]

show_images_in_line_from_tensors(img_base_list)

Predictions from the fine-tuned YOLOv8 model:

In [41]:
base_dir = os.path.join(RUTA_CARPETA_RESOURCES, 'imgs', 'inferencia', 'yolo_results', 'yolov8_retrained')
img_base_list = [read_image(os.path.join(base_dir, img_path)) for img_path in os.listdir(base_dir) if img_path.endswith(FORMATOS_IMAGENES)]

show_images_in_line_from_tensors(img_base_list)

Conclusions¶

⚠️ Problems encountered:

  • Image size is an issue. Training with very large images can run out of memory and takes a long time; in our tests this happened most of the time (Google Colab - Kaggle Notebooks). Conversely, training with very small images lowers the evaluation metric, since the network detects fewer features.
  • Model sizes: in the case of Faster R-CNN, checkpoints reached 400 MB, which limits deployment at the edge (on devices).
  • Training a model in plain PyTorch is far more complex than using frameworks such as YOLO or MMDetection. This extra effort must be taken into account, even though full-PyTorch models offer more versatility.
  • Although the mAP metric is widely used, there is no standardization beyond the pycocotools utilities, which makes it hard to adapt into standard code.
  • Data augmentation did not work well with PyTorch: rotations and perspective changes in particular were not applied correctly to the bounding boxes. Albumentations improved things, but the results were still not perfect. An example is the following image:

Figure 5: Failed data augmentation.
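On the mAP standardization point above: even the interpolation scheme varies between implementations. A minimal all-point-interpolation average precision over a toy ranked list of detections (a sketch, not the pycocotools algorithm):

```python
def average_precision(tp_flags, n_gt):
    """All-point-interpolated AP from detections sorted by confidence.

    tp_flags: 1 if the k-th ranked detection matched a ground truth, else 0.
    n_gt: total number of ground-truth objects.
    """
    tp = fp = 0
    points = []  # (recall, precision) after each detection
    for flag in tp_flags:
        tp += flag
        fp += 1 - flag
        points.append((tp / n_gt, tp / (tp + fp)))
    # Make precision monotonically non-increasing (sweep right to left)
    for i in range(len(points) - 2, -1, -1):
        points[i] = (points[i][0], max(points[i][1], points[i + 1][1]))
    # Area under the interpolated precision-recall curve
    ap, prev_r = 0.0, 0.0
    for r, p in points:
        ap += (r - prev_r) * p
        prev_r = r
    return ap

# 5 ranked detections, 3 of them correct, 4 ground-truth palms in total
print(average_precision([1, 0, 1, 1, 0], n_gt=4))  # 0.625
```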

⭐ Conclusion ⭐

In conclusion, we observed that image size is clearly a hyperparameter constrained by the available resources.

We also noted the rigidity of pre-trained models, as well as how easy they are to use out-of-the-box. In this sense, frameworks such as Ultralytics are highly valuable, since applying them to a custom dataset is straightforward.

Another important point is the memory consumption of these models: for production scenarios it is worth studying a distributed training strategy, and also performing augmentations offline, since the model without augmentations trained much faster.

As a final conclusion, several nuances remain to be resolved, but these approaches are certainly suitable for production applications, with models such as YOLO standing out for their high performance and, above all, easy implementation.

💫 Possible improvements:

  • Optimize training and validation.
  • Try exporting the model with TorchScript or ONNX to check its size.
  • Improve data labeling. In some cases the annotations did not follow labeling best practices.
  • Further explore FiftyOne and its interaction with other labeling tools, such as CVAT.
  • The current pipeline resizes the images. The right approach would be to map the processed outputs (bounding boxes) back to the original size afterwards; for example, a prediction-only transform applied when prediction=True.
  • Further study PyTorch GPU memory management.
  • Further study the mAP metric, especially the TorchMetrics (PyTorch Lightning) implementation.
  • Further investigate detection on large images (SAHI targets small objects, but in a very large image objects become small anyway).
  • The Faster R-CNN network has a classification component called the "backbone". We are currently using ResNet with an improved version of its weights, but trying other state-of-the-art backbones (e.g., EfficientNet) remains open.
  • Investigate the spatial-transformation problem on this dataset, which worked neither with Albumentations nor with PyTorch.
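The resize point above can be sketched as follows: a hypothetical helper that maps boxes predicted at the resized resolution back to the original image size (pure Python, independent of any framework):

```python
def scale_boxes(boxes, from_size, to_size):
    """Maps (x1, y1, x2, y2) boxes predicted at from_size = (w, h)
    back to the original image size to_size = (w, h)."""
    sx = to_size[0] / from_size[0]
    sy = to_size[1] / from_size[1]
    return [(x1 * sx, y1 * sy, x2 * sx, y2 * sy) for x1, y1, x2, y2 in boxes]

# A box predicted on a 480x480 resize, mapped back to a 1920x1080 original
print(scale_boxes([(120, 240, 360, 480)], (480, 480), (1920, 1080)))
# [(480.0, 540.0, 1440.0, 1080.0)]
```

This transform would only be applied in the prediction path (the `prediction=True` case mentioned above), leaving training untouched.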

References¶

  • Ammar, A., & Koubaa, A. (2023). Aerial images of palm trees [Data set]. Kaggle. https://doi.org/10.34740/KAGGLE/DSV/6382990
  • TorchVision Object Detection Finetuning Tutorial — PyTorch Tutorials documentation. https://pytorch.org/tutorials/intermediate/torchvision_tutorial.html

Appendix¶

If you run into GPU memory problems, try the following workarounds:

!pip install GPUtil

import torch
from GPUtil import showUtilization as gpu_usage
from numba import cuda

def free_gpu_cache():
    print("Initial GPU Usage")
    gpu_usage()                             

    torch.cuda.empty_cache()

    cuda.select_device(0)
    cuda.close()
    cuda.select_device(0)

    print("GPU Usage after emptying the cache")
    gpu_usage()

free_gpu_cache()
import gc

def report_gpu():
    print(torch.cuda.list_gpu_processes())
    gc.collect()
    torch.cuda.empty_cache()
# In the training loop:
def train_loop(files, batch_size, model):
    size = len(files)
    for batch in range(size//batch_size):
        im_paths = files[batch*batch_size : (batch+1)*batch_size]
        X,y = load_batch(im_paths)
        # Compute prediction and loss
        output = model(X, y)
        del X
        del y
        gc.collect()
        torch.cuda.empty_cache()
# Check which device the model is on
next(model.parameters()).device

# Check which device a tensor is on
tensor.device